Google Cloud Platform Tutorial for Beginners

Captions
So let's discuss what Google Cloud is, how to create a Google Cloud free trial account, and two concepts specific to Google Cloud: zones and regions.

In one line, Google Cloud is nothing but 200-plus different cloud computing services in areas like compute, storage, networking, databases, machine learning and AI services, dev tools, serverless products, plus containerized applications you can deploy. Combine all those categories and you have 200-plus different cloud computing services, and that is what is referred to as Google Cloud.

Next, the Google Cloud Platform account. Let's go to the Google Cloud website; currently I'm at the home page of Google Cloud Platform. If you go to Products, a huge list of products is shown, grouped by category: compute, as I said, analytics, Internet of Things, migration products, hybrid and multi-cloud products, and many more. You can check out all of those products from here.

As far as creating an account on Google Cloud is concerned, for all new customers Google gives you $300 of free credit which you can spend over the next three months. To create the account, simply search for "GCP free trial", go there, and provide your credit card after clicking Get Started for Free. In this particular case I've already logged in with one of my active accounts, gcp.tutorial.2021@gmail.com, and this is the account we're going to use throughout the whole course. If possible I'll make one video with a step-by-step process for creating a Google Cloud account.

The last topic in this video is GCP regions and zones. Let me go back to the global locations in GCP; you can refer to this particular link: cloud.google.com/about/locations. Regions are independent geographical areas. If you look at the world map, Delhi is a region, Tokyo is a region, Iowa is a region, Montreal is a region, São Paulo is a region. The way we address an individual country or city, the same way Google addresses each independent geographical area as a region.

Inside a region, Google keeps its data centers, and those data centers are referred to as zones. If you take Warsaw, Warsaw contains two or maybe three different zones; Frankfurt has two or three zones. When I say zone, it means that many data centers are available where all your compute, storage, and databases exist; all Google Cloud computing services are served from those data centers. So in one line: zones are specific data centers, and regions are where your data centers reside. Google will always ask you in which region you want to deploy your service, and, within that region, in which particular zone.

If you scroll down you can see the very high-level geographical areas they've grouped, like Americas and Europe, and which services are offered in which region. Let me show you the GCP region and zone names for the first service, Compute Engine. If you scroll down you can see what is considered a zone: whenever a name ends with an alphabetic character (a, b, c), it is a zone; everything else is a region. So asia-northeast3-b is a zone, part of the Seoul, South Korea region.

All right everyone, that is a very basic, high-level overview of what Google Cloud is, how to create a Google Cloud account (although I'll cover that in a separate video), and what regions and zones inside Google Cloud are.
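As a hedged aside, the regions and zones described above can also be listed from the gcloud CLI (this assumes the Google Cloud SDK is installed and authenticated; the filter value is just an example):

```shell
# List all available regions (independent geographical areas)
gcloud compute regions list

# List all zones (individual data centers); zone names end in a letter,
# e.g. asia-northeast3-b is zone "b" in the Seoul region
gcloud compute zones list --filter="region:asia-northeast3"
```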
Hello and welcome everyone. In this video we'll see how to create the free trial Google Cloud account. I already have a standard Google account which I'm going to use. Getting started with a free trial account costs you just your credit card, and you get $300 of credit to spend over the next three months. If you search for "Google Cloud free" you can see the Google Cloud free trial page: get started for free, 20-plus free products, $300 of credit to spend over the next three months as of recording this video (in the future they may change this). For each product they've also set a limit on how much you can use completely free of cost.

Okay, let's get started: click Get Started for Free, or simply sign in. If you have a Google account you can use it to sign in, or you can create a new Google account. I'm going to use the account gcp.tutorial.2021@gmail.com; provide your password, and it will redirect to the cloud.google.com page. From here you can go to the console, or go directly to console.cloud.google.com, and it will pop up a sign-in screen.

Let me close all these callouts, we don't need them. There are a few things you need to provide at the first step: your country (in my case it's India; select your respective country), then accept the terms of service, and agree and continue. On top you can see a notification like "Your free trial account is waiting. Activate now to get $300 credit to explore." Let's click Try for Free.

There is a two-to-three-step process. The first is account information: the country (I selected India), and what best describes your organization's needs; I'm a simple sole trader, so I'll select "for my personal project", or let's say "career experience / getting certified". Accept the terms of service, continue, and confirm your number. Account type: it will be an individual account. Provide your credit card details; if you want to provide your tax information you can do it here, or simply click Start My Free Trial after providing your credit card. Oops, registered individual or unregistered individual? It's an unregistered individual; let me click Start My Free Trial.

It's now setting up the billing information. Depending on which credit card or debit card you're using and from which finance company, you may need to go through a verification process; continue. You can also fill out this form about what you're going to use the account for if you want; let me close it.

Now let me go to my home page and dismiss this, and you can see our Google Cloud free trial account is ready. How can we check? Go to Billing from the hamburger menu, and you can see I now have the equivalent of $300 of credit available in my own country's currency, with 91 days remaining. As of recording this video it's currently the 19th of September, so on the 19th of December it will expire. Let me refresh the whole page; at the top you'll also see the same information, which I just wanted to show you: free trial status, this much credit available, 91 days remaining, with full account access. Let me dismiss it; after the trial, you'd need to pay.

All right everyone, it's a very simple two-step process: provide your credit card information and some account information, and bang, you can easily get started with a Google Cloud account. That's all for this video and I'll catch you in the next one.

So in this video we'll see what a Google Cloud Platform project is and what a billing account is. Let me go to my browser; the first thing is to log into the Google Cloud account. For that we can go straight to console.cloud.google.com and provide the credentials you signed up with. I have many email addresses; the currently active one is gcp.tutorial.2021@gmail.com, and with this email ID only we're going to work throughout this whole course.
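The console sign-in above also has a command-line equivalent. As a minimal sketch, assuming the Google Cloud SDK is installed locally:

```shell
# Authenticate the gcloud CLI via a browser flow
gcloud auth login

# Make the course account the active one for subsequent commands
gcloud config set account gcp.tutorial.2021@gmail.com
```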
All right, I'm at the dashboard of Google Cloud Platform, and currently you can see "My First Project" is set. Let's create a new project for our Google Cloud crash course. Click the project selector in the top panel bar; it lists the recent projects, or all the projects you've already started. Currently this is not an organization-based account, so by default it's "No organization", and these are some projects I created for my own practice and for my other GCP courses, like Terraform on GCP.

Okay, let me create a new project. Projects are like a big umbrella under which every single resource resides: every resource you provision in Google Cloud Platform has to be tied to a project. Let's name it "google cloud crash course". This name is just for display purposes, for humans; there is also a project ID which gets created automatically and which you cannot change later on. That is the unique identifier for our project. Let's continue and create it.

The project gets created. Go to the top panel bar, select "google cloud crash course", and you can see the project is now set to it. Whatever resources we create from now on will be part of this project only.

From the navigation menu on the left-hand side there is Billing. Every single project must have one billing account associated with it. Currently no billing account is attached, so let me link one. One billing account already exists, so let me select it and set the account. Before setting it, I just want to go to Manage Billing Accounts: you can see that when we first created the account, the default "My First Project" was automatically attached to the default billing account. One billing account can have multiple projects attached to it, but every single project has to be attached to some billing account, otherwise Google simply won't allow you to create any resources inside that project. If you go to My Projects you can see the available projects: the other three projects have billing associated, but for the project we just created, billing is currently disabled.

So let me change the billing (you can do it from here as well): select it and set the account. Okay, what does it say? "You have reached the limit of projects on which you can enable billing", because there are already three projects with billing attached. So what we'll do is disable billing on "My First Project". Let's disable it; billing on My First Project is now disabled. Now let me enable billing for "google cloud crash course" and set the account. All right, bang, we're done: our project is set up and our billing account is also set up. From the next video we can start provisioning services inside it. I'll catch you in the next video.

Hello and welcome everyone. Let's discuss the Virtual Private Cloud inside GCP, Google Cloud Platform. Let me go to the Cloud Console; I'm currently at Billing, so let's go back to the dashboard. You can see the project is currently set to My First Project; I'll change it to "google cloud crash course", where we're going to provision all our resources. From the navigation menu let me search for the networking-related products; among them we have VPC network. Now, why are we learning VPC networks first? Because if you don't have a VPC, you simply cannot create a Compute Engine instance, and I don't want to just go with the default one. You can see it first asks for the Compute Engine API to be enabled, so let's enable it.
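The console steps above can also be sketched with the gcloud CLI. The project ID and billing account ID below are placeholders, and on older SDK versions the billing commands live under `gcloud beta billing` instead:

```shell
# Create a project; the project ID is permanent and globally unique
gcloud projects create gcp-crash-course-12345 --name="google cloud crash course"

# See which billing accounts exist, then link one to the project
gcloud billing accounts list
gcloud billing projects link gcp-crash-course-12345 \
    --billing-account=000000-AAAAAA-111111

# Enable the Compute Engine API (this also creates the default VPC)
gcloud services enable compute.googleapis.com --project=gcp-crash-course-12345
```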
The moment you enable the Compute Engine API, one default Virtual Private Cloud is created inside our project. There are other ways to create a virtual private cloud too: there is a custom VPC you can create yourself, and there is the one created by default by the Compute Engine API. There is a strong connection between the networking products and all the other products available inside GCP, specifically the Compute Engine products. In the background a lot of things are going on; it is creating one default VPC.

All right, you can see it has redirected us to the VPC network screen, and one default VPC has already been created. Inside this default VPC they have created one subnet for each region; currently a total of 28 subnets got created, which means there are a total of 28 regions. Corresponding to each individual region they created one subnet, inside which we can create our resources. That's why I covered this topic first.

That is the default VPC it created; now I'm going to show you how to create a VPC network of your own. There is an auto VPC mode and a custom VPC mode. Let me go ahead with auto mode first. Why is this an "auto" VPC? Because the subnet creation is completely automated: just like the default VPC we're creating another VPC, but now it's a manual process and all the subnets will still be created automatically. Don't get confused that only 15 subnets are shown here; once we create this VPC it will show all 28 regions. Let me accept all the values that get applied automatically and create it.

That's the auto VPC; while it's still creating, we'll create one more VPC. This one is a custom VPC, for which we're not going to use the automated mode for subnet creation; instead let's continue with custom mode. It will not create any subnets automatically; we need to create the subnets ourselves. So let me create a subnet, let's say "sg", meaning we want to create it in the Singapore region. For the Singapore region the corresponding region name is asia-southeast1. Then the IP range: we provide a particular IP range because once we create any resource, one of the IP addresses from that range will be allocated to it. Let's continue with the default options; you can add more subnets too. We're going to use this VPC when creating a Compute Engine instance in later videos.

It's still creating; let me refresh. Yeah, you can see the auto VPC is created and the custom VPC is still creating. In the auto VPC, same as the default one, we have 28 subnets; it's an exact replica of the default VPC, just created manually, nothing special about it. But when the custom VPC is provisioned you'll see we have just one subnet: using this VPC network you can create resources only in the asia-southeast1 region, in the subnet "sg", with the IP address range we gave manually. All right, that is the Virtual Private Cloud; I'll catch you in the next video.

Let's discuss IP addresses inside the Google Cloud networking products. There are mainly two kinds of IP address: ephemeral and static. Ephemeral means that when you attach this kind of IP address to your Compute Engine resource and the instance restarts, the IP address changes, so you cannot rely on that particular IP address. When you go live you need an IP address that is always constant in nature; that is a static IP address. Ephemeral IPs you don't need to create, but a static IP you need to reserve yourself.

If you go to VPC network, there is one more tab available: External IP addresses. From here you can reserve a static IP address. These static IP addresses are costly: once you create one it will eat into your free trial credit, so make sure that if you're not going to use it, you simply delete it. If it's attached it costs a little less; if you're not using it you pay an even higher amount. Okay, network service tier: let's continue with the other default options. In which region do you want to create it? We already created a subnet in Singapore, so let's continue with the Singapore region, asia-southeast1, and reserve it.

With every single service, or at least the majority of them, when you finally provision a resource there is an equivalent command line you can view for creating the same resource via the CLI. It's good practice, when creating resources from the Cloud Console, to always look at the command-line utility it's using. To create a reserved external IP address it's going to use gcloud compute addresses create, then your address name, the project under which you're creating it, and the region. Let me close that and reserve it.

Again, make sure that if you're not going to use it, you simply select it and release the static IP address. We're going to use it inside our Compute Engine resource, so I'm not releasing it for now. You can see "In use by: none": it's not being used by any service, not tied to anything, but later on we'll tie it. Okay, I'll catch you in the next video.
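The networking steps above can be sketched with the gcloud CLI. The VPC, subnet, and address names match the walkthrough; the IP range is illustrative:

```shell
# Auto-mode VPC: one subnet per region is created automatically
gcloud compute networks create auto-vpc --subnet-mode=auto

# Custom-mode VPC: no subnets are created automatically
gcloud compute networks create custom-vpc --subnet-mode=custom

# Create the "sg" subnet in the Singapore region with a manual IP range
gcloud compute networks subnets create sg \
    --network=custom-vpc --region=asia-southeast1 --range=10.0.0.0/24

# Reserve a static external IP address in the same region
gcloud compute addresses create my-static-ip --region=asia-southeast1

# Release it when no longer needed, to avoid charges
gcloud compute addresses delete my-static-ip --region=asia-southeast1
```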
Let's discuss Cloud Identity, roles, and permissions. Inside Google Cloud there are five different ways an identity can be represented: you can use your standard Google account (the one you used to create your Google Cloud account), Google Groups, a service account (don't worry about it, in the next video we'll have a detailed discussion on it), or a Google Workspace or Cloud Identity domain account. The last two are specifically related to organizations: when you have some kind of domain you can attach it, and all the projects and folders in the resource hierarchy you create become part of that organization. A service account is mainly used when an application wants to communicate with another application; it's not for humans, and you use it with keys, or I would say tokens. The Google standard account is the account we created and logged in with inside Google Cloud Platform. And Google Groups are nothing but collections of Google accounts: rather than assigning roles to individual users, you assign a role to the group, and that role applies to every single user who is part of the group. So those are the five main ways you can connect with Google Cloud.

Now let me go to my Cloud Console and the dashboard. To manage all the identity and access management related things you go to IAM & Admin. That's identity; the other thing is roles. By default Google has created a huge number of roles: if you go to the Roles tab on the left-hand side you'll see more than 800 predefined roles for the different services, and this number keeps growing as Google releases new services. For example, for the Apigee products there are this many roles available; on another page we have Bare Metal roles, BigQuery-related roles, billing-account-related roles; Google has already created all of these. If you open any one of them, say Cloud Deploy Admin, you'll see it is a collection of permissions describing what this role can do. Those are the permissions attached, but you cannot assign a permission directly to a Google account, or I would say a Google Cloud identity: a combination of permissions makes up a role, and the role is what you can attach to an individual user. These are all the roles Google has created for you by default, but you can definitely create your own custom role from here: add all the permissions you require in that role and create it. So in this video we've seen three things: identities, i.e. the accounts, and roles, which are nothing but collections of permissions. In the next video we'll see how to attach the two.

Hello and welcome everyone. In this video we're going to discuss service accounts and role assignment. Service accounts are not for humans; they're for non-human users, when an application wants to communicate with another application. Let me go to my Cloud Console: in the top right corner you can see I'm currently logged in with this particular user, a standard Google account, and with this account I can do a lot of things. But let's say some application wants to do something inside your Google Cloud platform; in that case you can use a service account. Currently there is one service account created by default, the Compute Engine default service account. How did it get created automatically? When we enabled the Compute Engine API, one default service account was automatically created.

Let's see how to create our own service account. The service account name will be something like "gcp crash course", and this is our service account ID. Create and continue. For now I won't assign any particular role; or, we can simply assign one: to this particular service account I'll assign the basic Viewer role, so it has read-only access to all the resources available inside Google Cloud Platform. You can see the crash course service account got created and has been assigned one role. Let me copy its email; next we're going to go to IAM.
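The service-account steps above can be sketched with the gcloud CLI; the project ID is the placeholder used earlier, and the service-account email follows GCP's standard `NAME@PROJECT.iam.gserviceaccount.com` pattern:

```shell
# Create a service account for non-human (application) access
gcloud iam service-accounts create gcp-crash-course \
    --display-name="gcp crash course"

# Grant it read-only (Viewer) access on the project
gcloud projects add-iam-policy-binding gcp-crash-course-12345 \
    --member="serviceAccount:gcp-crash-course@gcp-crash-course-12345.iam.gserviceaccount.com" \
    --role="roles/viewer"
```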
In the IAM tab you can attach or assign different roles to an individual identity, whether it's a Google standard account, a Google Group, a Google Cloud Identity or Workspace domain account, or a service account; every attachment, or binding, happens from this particular tab. So let's add one more binding. Currently these are the default bindings already created, and you can see our main account, gcp.tutorial.2021, has Owner access, meaning it can do every single thing with our account.

Let me add a member. I have one more personal Google account, and to this particular user, ankit.25878@gmail.com, I want to grant a role. You can see it's a valid Gmail ID, or I would say Google account, which is why it automatically gave a suggestion. Let me select the role. There is a huge list of roles available; if you had created any custom roles, those would also be listed here, but we haven't created any, so none will appear. Now let's say I want to give this particular user the Kubernetes Engine Admin role: I want this user to manage all my Kubernetes resources. Simply select the Kubernetes Engine Admin role and save it; you can see one role got assigned to this user.

Now let's say I'm this employee's manager and I want to assign some more roles; just add them from here. Say I want this user to work on BigQuery as well: we have a BigQuery Admin role I could assign, or, for just full access to all datasets and their contents, BigQuery Data Owner; I'll assign that. Depending on the bare minimum requirement, you assign the appropriate role. We assigned these roles to a Google standard account; the same way, you can assign roles to a service account. This is the service account we just created, and just like a standard account you can attach a particular role to it; let's assign the Editor role to this service account and save it.
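The role bindings above map to `gcloud projects add-iam-policy-binding` calls; the project ID is the placeholder used earlier, and Kubernetes Engine Admin, BigQuery Data Owner, and Editor correspond to the role IDs shown:

```shell
# Grant a user the Kubernetes Engine Admin role
gcloud projects add-iam-policy-binding gcp-crash-course-12345 \
    --member="user:ankit.25878@gmail.com" --role="roles/container.admin"

# Also grant the same user BigQuery Data Owner
gcloud projects add-iam-policy-binding gcp-crash-course-12345 \
    --member="user:ankit.25878@gmail.com" --role="roles/bigquery.dataOwner"

# Grant the Editor role to the service account
gcloud projects add-iam-policy-binding gcp-crash-course-12345 \
    --member="serviceAccount:gcp-crash-course@gcp-crash-course-12345.iam.gserviceaccount.com" \
    --role="roles/editor"
```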
Next, if you go to Service Accounts: with every single service account, keys are associated, but you need to generate those keys manually. You can use a key when your application interacts with Google Cloud programmatically, say from Python code, Java code, or Node.js code. To generate a key, go to the three dots, click Manage Keys, and very simply Add Key, Create New Key, select JSON, and create it. It downloads a JSON file which you can then use so that your service account gets authenticated. All right everyone, that's service accounts and how to assign a role to an individual Google Cloud identity. I'll catch you in the next video.

Let's create a virtual machine, a Compute Engine resource, inside Google Cloud Platform, and for that we'll use both the networking concepts and the service account from earlier. Let me go to my Cloud Console; our Compute Engine API is already enabled, so from the navigation menu let me go to Compute Engine, and this time it won't redirect us to enable the API. Currently there are no instances, so let me quickly create a new one. I'll name it "gcp-demo". Then the region where you want to create it: we already created a subnet in the Singapore region, but instead of Singapore, for now let's continue with us-central1; that's the region, the independent geographical area. Then the zone: zones are specific data centers inside the region. You can see us-central1 contains the zones us-central1-a, b, c, and f; select any one of them and go ahead.

Next is the machine configuration you require: you can select, based on your workload, whether it's compute-optimized, memory-optimized, or a GPU-specific machine configuration.
Let's continue with a very simple general-purpose one. Inside the general-purpose family we have first-generation and second-generation machines available; let's go with the very first generation, the N1 series virtual machines. Here you can select how much virtual CPU and how much RAM you require; g1-small, with one virtual CPU and 1.7 GB of RAM, is good enough for us. Do you want to deploy a container inside this virtual machine? No; if you want to, you can select it here.

Next, the boot disk: which operating system do you want to boot with? Let me change it; you can see a huge number of operating systems are currently supported, all the famous Linux distributions like CentOS, Debian, Ubuntu, Red Hat, Fedora. Let me continue with Debian, Debian 9. How much storage do you want? 10 GB is good enough; let me select it.

Next is identity and API access. There is the Compute Engine default service account, created by default when we enabled the Compute Engine API, but apart from that, in earlier videos we manually created a "gcp crash course" service account. If you want to use your custom service account you can select it from here, and now your virtual machine will be authenticated via this service account only; your virtual machine takes on the identity of this service account. If you want to host some application that needs to allow HTTP traffic, you can select that option.

The important part is networking. There is already one default network interface; let's say we don't need this interface. From the drop-down, we've already created multiple networks: one is custom-vpc, and you can see we immediately got the error "You must select a sub-network", with nothing in the drop-down, because inside this custom VPC no subnetwork exists for us-central1; we created a subnetwork only in the Singapore region. So let me select the Singapore region. Now you can realize the importance of the network: without networking you simply cannot create the resource. So this is the Singapore region, and inside the Singapore region, asia-southeast1-b, where we want to provision our Compute Engine resource; you can see Singapore got selected automatically. External IP address: we've already created a static one; done. Any extra disks you want to attach, you can attach from here. For the time being let me continue with all the other default options and create our virtual machine.

Virtual machine creation inside Google Cloud Platform is very fast: within a maximum of around 30 seconds it provisions a virtual machine resource for us. Let me refresh; yeah, you can see the instance is running, and here is the internal IP address it was given and here the external IP address. If you want to go inside this machine you can SSH into it and a full-fledged machine is available to you. "Do you want to initiate an SSH connection to your VM instance?" Yes, connect. I don't want to go inside it right now, but it's a full-fledged virtual machine you can play with directly. All right everyone, that's how quickly you can create a virtual machine resource inside Google Cloud Platform.

So far we've created a virtual machine and one external IP address, and that is going to cost you. So I highly suggest, if you're not going to use it, simply select it and, from the three dots, delete your instance; that is releasing the resource when you're not going to use it. There's one more thing: from VPC network we have the external IP address, so let me release this external IP address we're not going to use. Currently it's still in use by the virtual machine instance, which is why the Release Static Address option is disabled. Let me refresh; you can see our machine got deleted.
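The instance configuration above can be sketched as a single gcloud call. This is pieced together from the choices in the walkthrough; the service-account email and address name are the placeholder names used earlier, and the Debian 9 image family matches the video but is deprecated on newer projects:

```shell
# Create the VM in the Singapore zone, on the custom VPC subnet,
# with the custom service account and the reserved static IP
gcloud compute instances create gcp-demo \
    --zone=asia-southeast1-b \
    --machine-type=g1-small \
    --image-family=debian-9 --image-project=debian-cloud \
    --boot-disk-size=10GB \
    --network=custom-vpc --subnet=sg \
    --service-account=gcp-crash-course@gcp-crash-course-12345.iam.gserviceaccount.com \
    --address=my-static-ip

# Clean up when done, to avoid spending free trial credit
gcloud compute instances delete gcp-demo --zone=asia-southeast1-b
```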
release this static ip address let me delete it all right everyone so that is how you can release your resource or i would say destroy your resource after you are not going to use those particular resource inside the cloud otherwise it will just simply wipe out all your free credit so after every single service if you are not going to use all those resource just simply delete it there is always delete option is available to remove it now removing this virtual private cloud just simply doesn't make sense it is not going to cost anything to us it will just simply recite we are not using it anywhere so in this video we will see how to deploy application inside the google app engine so earlier we have seen about the is solution infrastructure as a service where you have been given a complete machine and you can do whatever you want to do with that machine but now there are a lot of things google will take care specifically a server part or server management part traffic splitting part and a lot other things so you just need to focus as a developer your code and just simply deploy it everything will just start working it will auto scale it will auto heal you don't need to worry about what internally google does for you so let's see how to deploy your very simple hello world kind of application to google app engine let me go to my cloud console and for google app engine per project you can create one single application so from the serverless category app engine you can have a look at this is the part of serverless category that means there is no server that exists that doesn't mean that physically or virtually there is no server its only thing is that as a developer you do not need to manage those server google will manage for you let me close this panel bar and let's create app engine application first ever thing is you need to select in which region you want to create your app engine deployment so this region is permanent once you select it later you just cannot change 
If you ever wanted a different region, you would need to create a new project and deploy again. We are going to continue with us-central, that's good enough for us. Let me press Next, and while it is setting up the region, let me show you the code we are going to deploy. We have an app.yaml, which internally declares Node.js as the runtime environment; a package.json, which describes how to run the Node application (essentially a one-liner, `node server.js`); and server.js itself. Let me open it: it accepts incoming HTTP requests and in return simply responds with "Hello World from GAE", that is, Google App Engine, listening on port 8080. These are the three files we are going to upload to Google Cloud Platform.

Now let's check that our application has been set up. You can have a look: apart from Node, a lot of other runtimes are supported, like Python, Java, Go, and PHP, and there is no hard requirement to select one at this level. Instead, let me click on Activate Cloud Shell. This is a free resource Google provides, with a limited usage duration of around 50 hours per week. It is a full-fledged machine you can use for development purposes, and here we are going to upload our code. We haven't provisioned this machine; you get one with every single account. To upload, from the More menu let me choose the files; they reside inside my Downloads folder, so let me select all three. All right, the files are uploaded; if I run `ls`, everything shows up here. Next I can simply use the gcloud command-line utility, which is installed here by default: `gcloud app`, and if you ever don't understand something, just append `help`. We are going to deploy, so we have the option `gcloud app deploy`. Let me run it.
Authorize it, because this Cloud Shell is making API calls to the services. This is the default option, so let me press Y for yes and wait a little while it uploads all the code; once the deployment finishes I'll get back to you. All right, we got an issue: it says invalid runtime "nodejs", so we need to select a proper runtime. I think we can go ahead with nodejs14, so let me go to the app.yaml file, change the runtime line to `runtime: nodejs14`, save the file, and run `gcloud app deploy` again. Now there is an invalid argument in package.json; let me open it, change the Node version there to 14 as well, and deploy once more. You can have a look, it is updating the service. The deployment will take a couple of minutes, so once it completes I'll get back to you. All right, our code has been deployed successfully, and a URL is given to view it. Let me copy this URL (the moment you select text inside Cloud Shell it is already copied) and go to another tab, and you can have a look: we got "Hello World from GAE", which means our application has been deployed successfully. If you go to Services from here, you will see this particular service deployed, and under Versions there is one version of this service, serving 100 percent of the traffic. All right everyone, that is the Google App Engine deployment.

Google Cloud Functions, another serverless product inside Google Cloud Platform. People mainly use Cloud Functions to execute some very small task or workload: you deploy your code as one single function, and you can trigger this function via different methods, for example via HTTP, via a Pub/Sub topic, or when a file is uploaded to Google Cloud Storage. We are going to use the HTTP way. Let me go to my cloud console and quickly deploy
a very simple hello-world function. From the navigation menu let me go to Cloud Functions; it is getting something ready in the background. Cloud Functions are a very lightweight, event-based, asynchronous compute solution. Let me create a new function, say gcp-function-1. Which region? Let's continue with us-central1. How do you want to trigger it? As I said, there are a lot of trigger types; let's continue with HTTP. Based on the name we provided, this particular URL is what you use to invoke the function. Let's allow all unauthenticated invocations so that it is publicly accessible, and save. Under Runtime you choose how much memory to allocate, which service account to use, and the maximum number of instances of this function to run; let's say 5 is good enough, and press Next. There are a lot of runtimes supported here, just like Google App Engine: we have Go, Java, Node.js, Python, PHP, and Ruby support. Let's continue with the default Node.js. A simple hello-world function has already been written here, and this is what we are going to deploy into the Cloud Functions service. If you rename this function from helloWorld to something else, you need to update the entry point, which tells the service which function inside your code gets executed when you invoke it. Let me deploy; deployment takes a little time, so once our Cloud Function is ready we'll test it. All right, after some two minutes our function is successfully deployed. Let's go inside and test it. How do we trigger it? Let me go to the Trigger tab: the same URL is available, and if you click it, it opens in another tab and we get the output "hello from function".
All right everyone, that is how to deploy a Google Cloud Function via the cloud console.

Hello and welcome everyone. In this video we'll look at Docker: we have one very simple Node.js-based application which we are going to package as a Docker image, and after that we'll look at another product, Container Registry, where we will push this Docker image. Later on, for other deployments, whether Cloud Run or Kubernetes Engine, we can deploy this same image. Container Registry is something like Docker Hub: a place inside Google Cloud Platform where you can push and store all your Docker images. Let me go to the cloud console and open Container Registry; it sits under the CI/CD section. Google has also released its advanced version, where you can store not only Docker images but other kinds of artifacts as well, for example Maven or Python packages. If you go to Container Registry, there are currently two images already uploaded, and we didn't upload anything: when we deployed our Cloud Function and App Engine app, those services internally used this registry for staging purposes. Now, I have one very simple piece of code with a single server.js file. Let me open it: it is very simple, like what we used for the App Engine product. It accepts an HTTP request and returns a string; let me remove the Kubernetes part (we are going to deploy to Kubernetes later anyway) and make it say "Hello Docker". It listens on port 8080 and returns that string. To build the Docker image we have a Dockerfile available; if you open it, it uses node 16 on Alpine as the base image, because that's a very lightweight image.
It exposes port 8080, copies the server.js file from our machine, where Docker is installed, into the image we are creating, and finally starts `node server.js`, meaning server.js will be the process in execution. Okay, without much further ado, let me open my Cloud Shell. All right, Cloud Shell is open and I am currently inside my Google Cloud crash course project. First let me create a folder and upload both of these files; we can even upload the whole folder, so let me do that. This uploads everything from the container folder, which is perfectly fine: it creates the container folder for us and keeps both files inside, so we don't need to create the directory ourselves. Let's just check: yes, we have a container folder; let me go inside it and confirm both files exist. Now we are going to build the image and push it to Container Registry, and for that the gcloud command-line utility is available. Which product are we using to build this Docker image? That will be Cloud Build: `gcloud builds submit`. This particular command not only builds the Docker image but also pushes it to Container Registry. The `-t` flag is for giving the tag, the name of the Docker image you want to create: that will be gcr.io, a slash, and your project ID, which makes it uniquely identifiable, then the image name, let's say my-first-image, and after a colon an optional tag, say v1.0. Let's go ahead and authorize it. It first downloads the Node-based base operating system image and then packages everything into the Docker image. Build done? No, it's still running; after the build, the image is created and pushed to Container Registry.
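The Dockerfile just described would look roughly like this. A hedged sketch, since the exact base-image tag (node 16 on Alpine 3.11 per the narration) and the file layout are read off the audio rather than the screen:

```dockerfile
# Lightweight Node.js base image (the narration mentions node 16 on alpine 3.11)
FROM node:16-alpine3.11

# Copy the application code into the image
COPY server.js .

# The app listens on 8080
EXPOSE 8080

# Start the server when a container is run from this image
CMD ["node", "server.js"]
```

Building and pushing in one step is then `gcloud builds submit --tag gcr.io/PROJECT_ID/my-first-image:v1.0 .` (the video uses the short `-t` form), where PROJECT_ID is your own project's ID.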
It's pushing... done. Let me close this, refresh the repository, and you can have a look: my-first-image, a Docker image currently available with a size of 38.7 MB. All right everyone, that is how you create a Docker image and push it to Container Registry. In the next videos we are going to deploy this image to different products, Cloud Run and Kubernetes Engine. I'll catch you in the next video.

Hello and welcome everyone. In the last video we successfully created a Docker image and pushed it to Container Registry; now we are going to use the same image with another service, Google Cloud Run. Cloud Run is simply a place to deploy your containerized workload, and we have already containerized our workload into a single Docker image. Let me go to my cloud console; the Cloud Shell is not required, so let me minimize it, and let me open Cloud Run in another tab. Cloud Run is part of the serverless products. Let's create a Cloud Run service. For the container image URL, let me select it, and you can have a look: our my-first-image, with just one version, is currently available, so let me select that. Service name: my-first-image-run. Which region do you want to deploy to? Let's continue with us-central1; there are regions where Google claims very low CO2 carbon emissions in their data centers. CPU allocation and pricing: CPU allocated only during request processing, or CPU always allocated; this is a new option Google has brought to Cloud Run, and we'll continue with on-demand. Minimum number of instances: let's go with zero, so if there is no traffic it scales down to zero; maximum, let's say three. Any other advanced settings? The container port, which will be 8080; memory, where I think 128 MB is more than sufficient; one virtual CPU; and the request timeout, which can go to a maximum of 3600 seconds.
Beyond that it will simply time out. And per container, the maximum number of requests it can handle: 80. Next, allow all traffic, allow unauthenticated invocations as well, and let me create it. It is going to deploy the first revision of this Cloud Run service; creating the revision and routing traffic is pending, so I am fast-forwarding my video until the whole deployment completes. All right, you can have a look: it took around one to two minutes to successfully deploy our application, and they have provided a URL. Let's open this URL in another tab to test the application, and we successfully got "Hello Docker", which means our application has been deployed. Now, if you are not going to use it, simply delete it: select it and delete. All right everyone, that is how you deploy a Docker image to the Cloud Run service. In the next video we'll see how to deploy the same image to a Kubernetes cluster.

Hello and welcome everyone. In this video we are going to deploy the same Docker image to a Kubernetes cluster. There is a three-step process to follow here: first, we create a Kubernetes cluster; next, we deploy the app; but even after deployment you cannot use it yet, you need to expose it, so the last step is to expose your workload as a service. Let's see all three steps in action. Let's go to Kubernetes; the first thing is to enable the Kubernetes Engine API. All right, the API is enabled, so let's create a Kubernetes cluster. There are two modes, Standard and Autopilot; we'll continue with Standard, and I'm going to keep the majority of the default options. We'll continue with a zonal cluster, not a regional one. For the control plane version, that will be either a release channel or a static version; let's continue with the default. Then the nodes: how many worker nodes do you require, and the image type,
which will be Container-Optimized OS, the default. Which machine family do you require? Let's go with N1 and use a slightly more powerful machine, say 2 virtual CPUs and 7.5 GB of RAM. How much disk size? I think 20 GB is good enough. Maximum pods per node: 110. That is the node configuration. Under the default node pool, how many nodes do you want to provision? Let's see if one is allowed; we'll continue with just one in that case. The warning says that clusters smaller than three nodes may experience downtime during upgrades; that's perfectly fine, we are learning here. No other options, so let's create our cluster with just one single node; we took a slightly powerful machine, two virtual CPUs and 7.5 GB of RAM. Cluster creation takes a little time, so I'm fast-forwarding my video until the cluster is ready. All right, the cluster is ready. Clusters are simply a group of machines, you can say, where Kubernetes and its components have been deployed; we have just a one-node cluster because we are doing this for learning purposes. Let's go to Workloads and deploy. Which image are we deploying? Let me select it from our Container Registry, the one we pushed. Anything else? Only one container for this application; the name will be my-first-kube-app, and we'll continue with all the defaults. Into which cluster do you want to deploy? We have just one cluster, so let's deploy. All right, you can have a look, the deployment has finished, which means our Docker image was deployed successfully, but from outside you still cannot use it. For that you need to expose it: you can go to Services and create a new service, or expose it straight from the workload. The target port, meaning the port where your container accepts requests, is 8080, since that is what we wrote in our Node.js code; the service type is Load Balancer.
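What the console does in these deploy-and-expose steps can also be written out as plain Kubernetes manifests. A hedged sketch, assuming the names used in the video (my-first-kube-app and the gcr.io image tag from the earlier build); the console-generated YAML will differ in details such as labels and replica count:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: my-first-kube-app
spec:
  replicas: 1
  selector:
    matchLabels:
      app: my-first-kube-app
  template:
    metadata:
      labels:
        app: my-first-kube-app
    spec:
      containers:
        - name: my-first-kube-app
          image: gcr.io/PROJECT_ID/my-first-image:v1.0
          ports:
            - containerPort: 8080   # the port our Node.js code listens on
---
apiVersion: v1
kind: Service
metadata:
  name: my-first-kube-app-service
spec:
  type: LoadBalancer   # provisions an external IP on GKE
  selector:
    app: my-first-kube-app
  ports:
    - port: 80          # external endpoint port
      targetPort: 8080  # container port
```

Applying these with `kubectl apply -f` would be the command-line equivalent of the Deploy and Expose buttons used here.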
We provide the service name and just expose it. All right, you can have a look, our service has been exposed successfully. Let's go to the external endpoint: this external IP address, with port 80, is how we are going to access our application. Let me simply click it, and bang on, we got the output "Hello Docker", which means our image was successfully deployed on a Kubernetes cluster. Next, if you are not going to use it, let me show you how to delete it: first you simply delete the service, next you remove the workload, and at the last you delete the cluster. All right everyone, that is how you deploy your application to a Kubernetes cluster.

All right everyone, in this video we are going to learn about Google Cloud Storage, the storage where you can dump anything: the unstructured data storage solution available inside Google Cloud Platform. You can store any kind of data here, image data, video data, binary files, zip files, you name it, and access it via an HTTP REST API. Let's see how to create a bucket inside Google Cloud Storage and upload some objects into it; this is the blob storage solution available inside GCP. From the navigation menu let me go to Storage, where we have Cloud Storage. Currently in this project a few buckets have already been created by Google: this product is used for any kind of storage, and many other services use it for staging purposes. Now, you cannot just create objects directly anywhere; for that you need a bucket, so let's first create a bucket and then upload a file into it. The bucket name has to be globally unique, so if I just use "test", definitely someone has already taken it and that name
won't be accepted. So what do we do? Let's try something with our course name. Ah, it says the name cannot start with "goog", so you just cannot use "google"; let me remove that part. The crash course name alone is not accepted either, so let's make it crashcourse_205. Good enough, I think. You can also provide labels, say environment: testing, since that is what we are going to use it for, and continue. Location type: do you want to keep your data at a multi-region level, dual-region, or a single region? With multi-region you are putting your data in multiple locations, so you get the highest availability across the largest geographical area; with a single region you cover one small region. For our learning purpose let's continue with a single region, somewhere nearest to my location, say Mumbai, and continue. Storage class: it will be either Standard, Nearline, Coldline, or Archive, and there is a nice one-line description for each. When your data access frequency is very high, you use Standard; as the access frequency goes down, you move from Standard to Nearline to Coldline to Archive. Archive is best when you want to preserve data for the long term and access it only about once a year; roughly, you use Nearline for data accessed about once a month and Coldline for about once a quarter. From Standard to Archive your data access cost gets higher, but your storage cost reduces, so you can select a storage class depending on your business requirement. Even if you select Standard, you can later change it to Nearline or Coldline with a lifecycle rule, or even change the storage class manually, so nothing to worry about. Let me continue with Standard. Next is uniform bucket-level access for all objects inside the bucket, and the option to enforce public access prevention; we want to access our data publicly, so I am not going to select that option.
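The lifecycle rule mentioned above, demoting objects from Standard to a colder class after some time, can be expressed as a small JSON policy. A hedged example, with an assumed 30-day threshold:

```json
{
  "rule": [
    {
      "action": { "type": "SetStorageClass", "storageClass": "NEARLINE" },
      "condition": { "age": 30 }
    }
  ]
}
```

You would attach it with `gsutil lifecycle set rules.json gs://BUCKET_NAME`, or configure the same rule from the bucket's Lifecycle tab in the console.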
We'll keep uniform access control and all the other default options. Okay, the bucket crashcourse_205 is created. Now let's upload some files into it: we have the Docker server.js, and let me also upload an image. Do I have an image inside the Downloads folder? Let me check my backup; yes, this is my own image, I can try with this one. Upload started, and the file is successfully uploaded. Let me go inside, and you can observe the properties of each individual object and how to access it: just click on the authenticated URL, and bang on, we are able to access whatever file we uploaded. That is how Google Cloud Storage works. Inside a bucket you can upload any kind of file, image, video, binary data, anything; it is the blob storage solution available inside Google Cloud Platform. There is another way to store your data, block storage, so in the next video we'll learn how to store data as block storage rather than blob storage. I'll catch you in the next video.

Hello and welcome everyone. In the last video we learned about the blob storage solution; let me write that down here so there is no confusion: that was blob storage. Now we are going to look at the block storage solution, so let me go to my cloud console. To create block storage you go to Compute Engine: earlier, when we created a Compute Engine instance, we attached some disks, and those disks are exactly the block storage solution. In the same way, if you want to create a disk on its own, go to Compute Engine, then Disks, and create a disk; from this disk you can again create a new virtual machine. Okay, the disk name, let me continue with disk-1, that's perfectly fine. Do you want this disk in a single zone, or at a regional level? At the regional level, if something goes wrong you can always use the failover replica and redirect your traffic to it.
Let's continue with a single zone and select the specific zone where you want to create this disk. Next is the source of your disk: you can use an image, a snapshot, or simply allocate space with a blank disk; let's continue with a blank one. The disk type can be a balanced persistent disk, an extreme persistent disk, an SSD persistent disk, or a standard persistent disk; SSD will definitely cost a little more. You can always go to the linked documentation to check the details: you pick your size and the machine you are going to use, and they give you a nice comparison of the read and write capacity of each of the four, standard, balanced, SSD, and extreme. Definitely, with SSD you get very high throughput, 15,000 IOPS, you can have a look, which is not the case with the other three. Let's continue with the SSD persistent disk, and I think 10 GB is good enough for us. If you want to take snapshots of this disk at some periodic interval, you can select a snapshot schedule; there is a default schedule, or you can create your own, for example to take a snapshot every single day between this time and this time. Even if something goes wrong with your disk, you can recreate the disk from a snapshot. We won't use snapshot scheduling here. The last option is encryption, that is, how you want to encrypt your data; there are three major encryption techniques supported in Google Cloud. One is the Google-managed encryption key, where you do not need to manage anything; next is the customer-managed encryption key, where you keep all your keys inside another Google Cloud service, the Cloud Key Management Service; and the last is the customer-supplied encryption key, where you are not just managing but actually supplying the keys, generated at your own site.
Let's continue with the default, the Google-managed encryption key, where no configuration is required, and let me create it. This allocates a disk for us; I thought we allocated 20 GB, but let me refresh, and no, it was 10 GB only. The disk has been created, but it is still not attached to any virtual machine. You can click the three dots and simply create an instance out of it, and this disk will be attached to that virtual machine instance; you can have a look, the existing disk will be disk-1. We don't want to go toward Compute Engine for now, and if you are not going to use the disk, simply delete it, otherwise it is going to cost from your free trial credit. But this is how you allocate block storage space when you want to attach it to a virtual machine. All right everyone, that is all for this video and I'll catch you in the next one.

In this video we are going to discuss data migration: what different techniques are available for migrating your data from anywhere, mostly either from your on-premises data center or from another public cloud, into Google Cloud. Because if you do not bring your data into Google Cloud, there is no point in moving there at all. So let's see the different techniques Google Cloud offers for data migration from the outside world. Let me go to my cloud console and search for Data Transfer inside Storage. There are mainly three categories available for data transfer: the transfer service for cloud data, transfer appliances, and an on-premises option. Let's look at all three one by one and see the objective behind each. First, create a transfer job with the Transfer Service for cloud data: this is for when your data already resides in some other cloud, so the source will be, say, an Amazon S3 bucket or an Azure Storage container,
or specifically at some URL, or you may want to transfer data from one Google Cloud Storage bucket to another; in all those cases you can use this first option. If I go back, there is a nice one-liner they give: the transfer service enables you to quickly and securely transfer your data into Google Cloud Storage from a variety of online sources such as Amazon S3 or Azure Blob Storage, or to move data between two buckets. That is the story of how you bring data from other public clouds into the Google Cloud environment. Next is on-premises: how you securely transfer data from your own data center to Google Cloud Platform. Let me create a transfer job; for this you need to install an agent in your local data center, and this agent helps transfer your files after you create a connection to Google Cloud Platform. You can also schedule jobs here, so it doesn't have to run on demand only: after every periodic interval, the job will run. The last one is transfer appliances: when you want to transfer really huge amounts of data, say petabyte-scale, sometimes 20, 40, or hundreds of terabytes, and you don't have sufficient bandwidth. In that particular case it is not a recommended practice to transfer online using something like the gsutil command-line utility or the gcloud command line. Instead, the transfer appliance helps you move your data quickly: Google sends you a transfer appliance, you put all your data inside it, and simply ship it back to a Google Cloud Platform data center. What they say is that a transfer appliance is recommended when your data exceeds 20 terabytes, or when it would take more than one week to upload over your online bandwidth; then it is always recommended to ask for a transfer appliance and put all your data inside it.
All right everyone, so when you want to transfer a huge amount of data, you can use a transfer appliance. Those are the three major options available to transfer your data from different sources into Google Cloud Platform.

Google Cloud SQL. When your existing application in your own on-premises data center uses a traditional open-source database engine like MySQL or PostgreSQL, or even a commercial one like MS SQL Server, and you want to lift and shift easily to Google Cloud Platform, you can easily use Cloud SQL. Let me show you how to quickly create a Cloud SQL instance. Go to the Databases category, and inside it we have SQL. Currently no instance is running, so let's create our first Cloud SQL instance, and you can have a look, they are asking us to choose a database engine: currently three engines are supported, MySQL, PostgreSQL, or SQL Server. Let me choose MySQL. I'll name it mysql-1, provide a root password, and pick the database version; let's continue with MySQL 5.7. Do you want zonal availability in a single zone or across multiple zones? Multiple zones is definitely highly available, but let's continue with a single zone, in us-central1 (Iowa); it will automatically decide in which zone of that region to create this Cloud SQL instance. Depending on the options you select, you can always see the summary on the right-hand side. Any other configuration options? One very important one is the machine type, the configuration of the machine for this database instance; let's continue with a simple one, a shared-core machine with one virtual CPU and 1.7 GB of RAM. For storage, we'll continue with the recommended default of 10 GB, and we require a public IP.
For backups and maintenance, you can always configure how frequently to take a backup, and as far as maintenance is concerned, it is not under your control, but you can always provide a maintenance window so that Google performs all maintenance activity related to your database instance during that window. Okay, let's quickly create the MySQL, or I would say Cloud SQL, instance. Creating this Cloud SQL instance takes a good amount of time, mostly five to ten minutes, so I'm fast-forwarding my video until the instance is ready. All right everyone, after some 10 minutes our Cloud SQL instance for MySQL is ready. How can we connect to it? Simply click "Connect to instance", then "Open Cloud Shell"; let me dismiss this notification. Client IPs are not allowed by default, but when you connect from Cloud Shell, it temporarily allowlists the IP address of the Cloud Shell machine. Let me press Enter and authorize it. Currently the permission is denied, so we have to enable the Cloud SQL Admin API; I'm just going to open it in another tab and enable it. Let's try once again: you can have a look, it is allowlisting your IP address for incoming connections for 5 minutes, and here you need to provide the root username and password; if you want to connect as some other user, you can replace the root value of the `--user` flag with another one. Let's just wait until it asks for the root password; yes, it is asking, so let me provide it, and bang on, we are connected. If you list the default databases, these are the four databases available, and if I minimize this and go to Databases in the console, you will see all four of these databases appear there.
All right everyone, that is how you can connect to a MySQL instance and how you can provision one inside Google Cloud Platform. Now I don't need it, so I'm simply going to click Delete, provide the instance ID, and delete it. All right, so that is Cloud SQL: vertically scalable, but with limited compute resources. There is another option that is horizontally scalable, where you can do theoretically infinite scaling of your database in terms of compute and storage, so in the next video we'll look at Google Cloud Spanner.

Google Cloud Spanner. Spanner is another SQL-style solution available inside Google Cloud Platform, and it is one of the in-house products developed by Google. Compared to Cloud SQL, Spanner is way, way better in terms of high availability and scaling, which is not possible with Cloud SQL: with Cloud SQL you can always scale vertically, increasing the capacity of your individual database instance, but you just cannot scale horizontally, adding more nodes to increase the total capacity of your database. Those things are possible with Spanner, and compared to Cloud SQL instance creation you will be amazed at how quickly you can create a Spanner instance. Let me go to my cloud console. All right, I'm inside; from Databases, go to Spanner, the horizontally scalable one, and let me enable the Spanner API. The Spanner API is enabled, so let me create my first instance of Google Cloud Spanner, say spanner-1. Do you want to go regional or multi-regional? Multi-region is definitely going to cost a lot, so let's continue with regional; for the region, say Mumbai, and you can have a look, for one hour it is going to cost you 1.26 dollars, so it is quite costly. For the capacity, you can choose processing units or nodes.
node way so one node we are going to use it inside our spanner instance now you can have a look at there is only few parameters google has asked and i am just going to click upon this create and just see within a matter of five second our instance is ready so that's the beauty of this google cloud spanner in-house product developed by google let's create a database let's say it will be a university database and from here you can use the different ddl template so first everything we are going to create a table so let's say the table name so inside the university we are going to create a student table column name let's say s name will be a student what is the data type so name will be mostly string and we need to provide the size of string so let's provide some 100 characters one more fill let's say it will be h and we will make it integer 64 bit what is the primary key let's make it a name that is not a good one but still okay for our demo and let's create it so it is going to create a university db and inside the university db1 student table having this schema all right so we are inside the university database and we have one student table so this way you can explore yourself and go ahead insert some data inside this student table try to fetch the same data from here now this instance of google cloud spanner is a cost given so let me go inside and i'm just gonna delete it for now once your job done simply delete it otherwise for every single one hour it is going to cost you 1.26 dollar per hour basis all right everyone so our google cloud spanner instance is deleted i'll catch you in the next video cloud data store so it's a nosql solution available inside the google cloud platform so let's see how to configure it first of all it's a serverless product so you do not need to create any kind of instance in it next thing is this particular product data store is tightly coupled with google app engine so while creation of app engine we have already decided let me go 
inside the App Engine and see in which region we deployed our application: that is us-central1. Now if I go, let's say, to the Database category, we have Datastore. You can consider Datastore to be something like a MongoDB-style solution. You can have a look: currently the database location is us-central, and that is selected by default. Now there are other products also available, like Firestore. You can consider Firestore an advanced version of Datastore in some ways, and per project you can use either Firestore or Datastore, only one. So let me open Firestore. By default, in this particular project (google cloud credit scores) we have already selected Datastore mode, so if you go to the Firestore database you will see that it won't ask any questions; instead it will tell us that we have already selected Datastore, so we just cannot use Firestore: "this project uses another database service", that is, our Datastore. They are forcefully telling us that you can use either Datastore or Firestore, and currently we are on Datastore. Along with the App Engine, they have already configured a database location for this Datastore, which is nothing but us-central. So once you select the location for either Google App Engine or for Datastore, it will be selected for the other one automatically.

All right, so we are inside Datastore. This is a serverless product, so you don't need to worry about or manage anything; just configure your database location and simply get started. Now, what it says here is "switch to native mode": you can switch to native mode, and once you do, it will be converted to Firestore. But once you start putting in your data, once you configure it, you cannot change it later. For our case, let's create some entities inside this Datastore. Entity: let's say we are going to create a student, and the student has one key identifier, a numeric ID that will be automatically generated. Property: let me put a very simple property, s_name; let's say its value will be john. Do you want to index this property? Then you can select this option. Let me add one more property, say the age of John, some integer, let's say 25, and I don't want to index this particular property because I don't want to search based on age. And let's create the entity. All right, you can have a look: we created one entity inside our student kind. From here you can query things, say something very simple, SQL-like: select star from student. Let me run the query; all the results inside the student kind will be retrieved. You can put in more data and experiment with different clauses, like the where clause. All right everyone, so that is about Cloud Datastore.

Cloud Bigtable. Cloud Bigtable is another NoSQL solution available inside the Google Cloud Platform, and it is one of the proprietary products from Google. It's a wide-column database, so you can compare Bigtable with Apache HBase, its open-source counterpart. Okay, so let's see how we can create a Bigtable instance. Now in terms of pricing, Bigtable is comparable to Cloud Spanner, so once you no longer require your Bigtable instance, simply delete it; it is going to charge heavily against our free-trial credit if you don't. So let me create a Cloud Bigtable instance. I'll give a name like bigtable-1 as the instance name. The instance ID is permanent; you cannot change it later, and it has to be a minimum of 6 characters, so let's say bigtable-1. Select your storage type: let me select HDD (SSD is also good enough). You can see we haven't selected anything extra yet, and by default they are telling us that it is going to cost $468 per month. That's a huge amount of money, and
no extra storage has been added yet. Let's say 10 GB: for 10 GB it is going to cost $1.70, but a huge cost is incurred for one node of a Bigtable cluster. Let me continue. Now inside the instance you can configure multiple clusters; for our case let's continue with one single cluster, bigtable-1-c1. Okay, location: let's say we want it inside Singapore, in any zone, or let's select some specific zone. How many nodes? If instead of one node I make it two, the price immediately shoots up, it just doubles; we don't need that, so let's select one and create the Bigtable instance. You can have a look: within a matter of 10 seconds our instance is ready, and that is the advantage of a Google in-house product, because Google already knows the ins and outs of these products and how to make them fast.

All right, currently there is nothing inside it; we don't have any tables. So let me create one table and some column families. Let's say we are going to create a table like student, or rather employee, since we've done student many times. Now here the data is organized in the form of column families, because the data is stored in a columnar format, not a row-based format, so you can combine multiple columns into a column family. Let's say for employee there is one column family we want to call personal info. Let me go ahead with "never collect garbage" for it. Let me add a new family: it will be professional info, so I'll just write prof_info, and let me select the version-based policy, with a maximum of, say, 2 versions. So "never collect garbage" means that when you update, the earlier versions are not deleted. In the case of the version-based policy, say you have updated to two versions of a cell: you keep only the latest versions up to the maximum, and the rest are simply garbage-collected. Let me create it. You can have a look: our table has been created, having a total of two column families. Inside these two column families you can put all your data. Apart from that, you can do monitoring from here, and there is the Key Visualizer, which is a slightly advanced concept related to horizontal database scaling. All right everyone, so that is how you can quickly get started with Bigtable and put in your data. We are not going into detail about it, and if you are not going to use this instance, I would highly suggest you delete it, as is generally the case; I am just going very fast. All right everyone, so that is about Cloud Bigtable.

Google Cloud BigQuery. It's a data-warehousing solution available inside Google Cloud where you can process petabyte-scale data. When you want to apply, say, a very complex query, one can use the data-warehousing solution inside the Google Cloud Platform, which is nothing but Cloud BigQuery. Let me go to the console and show you how to quickly get started with it. Currently I am inside my google cloud credit scores project; let me go to the Big Data category, and inside Big Data we have BigQuery. Now, Google has recently updated the BigQuery UI. I don't want to continue with the new one, so I'm simply going to disable the new editor tab; it has its own advantages, and it will be stable in the near future, but for the time being let's continue with the old UI and fire some queries. All right everyone, so this is the BigQuery UI. Here you can write all your queries, just like SQL queries, and on the left-hand side you can see our projects. This is our project ID, automatically given by Google. Once I select this project, let me create a new dataset; all tables are created inside a dataset. Let's say demo ds... okay, a hyphen is not allowed, so let's just use an underscore: demo_ds. Database location: let's continue with the US. A Google-managed encryption key, and do you want some table expiration? That means, if I keep five days, then after
five days from table creation, tables will automatically be deleted, or expired. Let me create the dataset. So we have created one dataset, and inside this dataset you can create tables. To create a table, click Create Table. You can create a table in multiple ways: an empty table; a table from Google Cloud Storage, where some file already exists; by uploading some local file; or by fetching data from Google Drive or Cloud Bigtable. Let's create an empty table. Let me provide the table name, say tb_1, and you can provide some schema: let me add one field, name, string is good enough, and age, integer. Actually, instead of tb_1 let's name it student, which is more meaningful. Apart from that, you can have a look at the lot of other data types being supported. Let me create the table. So we have created an empty table having two fields, name and age. Now, with an insert query you can insert data, and likewise you can fetch data. Let me try: if I do select star from demo_ds dot student, you can have a look, this query will process zero bytes of data, because we haven't inserted anything. If I run it, I'll naturally get zero results, because there is nothing there; the query returned zero results.

Now, there are a lot of public datasets available inside Google Cloud, so if you want to experiment with those, you can explore the public datasets from here, or you can pin a project like bigquery-public-data, and it will list all the public datasets available; you can see a huge number of datasets. Let me quickly query one of them, say a random one: fda_food, food_events. In the preview you can see what data is available. Now, if you want to query these food events, how do you do it? That will be bigquery-public-data, fda_food, food_events. I'm not going to continue with star... actually, you can see that when we put star, the dry run tells us it is going to process 17.70 MB of data. The more data you process, the more it is going to cost you. Let me run it; 17.7 MB is not a very big number. All right, we got output. This way you can experiment with the different public datasets inside Google Cloud BigQuery.

Google Cloud Pub/Sub. It is for asynchronous communication: when you want to decouple two different applications and you want asynchronous communication between them, one can use Cloud Pub/Sub. You can consider Cloud Pub/Sub equivalent to the open-source Apache Kafka, where you can create a topic, temporarily store your data inside the topic as a queue, and later a subscription can consume it. Let's see how to create a topic, how to create a subscription for that topic, and how to publish to and consume from the topic and subscription respectively. Let me go to my cloud console, and let's go to Pub/Sub. Pub/Sub is part of the Big Data category; yes, we have Pub/Sub, and the API for Pub/Sub is already enabled. Let me create my first topic; say the topic name will be quiz_topic. It asks whether we want to add a default subscription: no, we are going to create our own subscription. This is the fully qualified name of the topic; let me create the topic with all other options at their defaults. All right, the topic has been created. Now I'm going to open Subscriptions in another tab, and we are going to create a subscription for this particular topic. Let me create a new subscription, say sub_1. For which Cloud Pub/Sub topic do you want to create it? We have just the one topic currently. Do you want pull-type or push-type delivery? Pull means that from the subscription you
are manually going to pull the messages that are inside the topic; push means that you need to provide some endpoint URL to which this particular topic will push all your messages, and from where you can consume them. We want pull delivery. What is the message-retention duration? Let's say two days is good enough; within two days we are going to process it. Expiration period for your subscription: 31 days, so if your subscription is inactive for 31 days, it will be expired. All right, let's create our subscription. The subscription has been added. Next, from the topic we are going to publish some messages. If you go to Messages, we can publish a message; let's say I publish one message like "hello one", and let me publish one more, say "hi bye", anything. So we have published two messages; how do we consume them? We can go to the subscription and view the messages. Google gives us the same functionality here too: we can select a particular topic and pull all its messages; the same UI is available here as well. So we are going to pull, and you can have a look, both messages appear, but you just cannot acknowledge them yet; for acknowledgement you need to enable this option, then pull and acknowledge both of them within a matter of 10 seconds. And now, if you try to pull again, you won't see these messages; no messages will be available here. That means once you acknowledge, you send an acknowledgement to the topic saying that you have processed these messages. Now you can simply delete it; there are no messages left inside our topic. All right everyone, so that is how easy it is to go to a global scale without configuring any kind of server, putting all your data inside Google Cloud Pub/Sub for asynchronous communication.

Google Cloud Dataproc. It's a big-data solution available inside the Google Cloud Platform where you can easily lift and shift all your existing big-data jobs, like Apache Spark jobs and Hadoop jobs, into the Google Cloud environment, and definitely take advantage of Google's infrastructure scale. Let's see how we can quickly submit one Apache Spark job to Google Cloud Dataproc. Let me go to Cloud Dataproc. The first thing is that we need to create a cluster. The Dataproc API is not enabled, so let me enable it: manage all your Hadoop-based clusters and jobs on the Google Cloud Platform. The API is enabled; let's create our cluster. I am going with a very simple one, one master and zero workers, so everything will run on a single machine. Configuration for the node: we require two virtual CPUs and 7.5 GB of RAM, good enough. We don't need 500 GB of disk, 50 GB is good enough; no extra SSDs, no extra customization. Default network, okay, nothing more than that; with all the remaining options at their defaults we can start creating our cluster. For the cluster name I'll just write something for this crash course, okay, and location us-central1. Master and worker will both be created on a single node, as we are using it for learning purposes. Now, creating the cluster will take a little time... and again it has redirected us to the Cloud Dataproc API page, there is some issue there, so let me go to Dataproc again and check whether it has started creating our cluster or not. Yes, cluster provisioning has started; once the cluster is ready I'll be back. Oops, it still keeps redirecting us to this particular page, not sure about that... yes, it's going now. Cluster provisioning will take a little time, and I'll get back to you once the cluster is ready; later we'll submit our Apache Spark job to this cluster. All right everyone, it took almost four to five minutes to create our cluster; now our cluster is ready and running, so let's quickly create a new job. Let me submit a job, keeping all the other options at their defaults; the cluster is the one we just created. There are a lot of different job types you can submit; we are going to continue with the Spark job. Apart from that, you
can submit PySpark jobs, Spark jobs, Hadoop jobs, Hive, Spark SQL, and Pig. Okay, which particular job are we going to submit? Along with the Apache Spark open-source project there is already one example provided, which calculates the value of pi by an iterative method. We are not going into detail about the algorithm, but I want to show you how quickly you can submit your job. For that we require one JAR file where the source code has been written, and you can provide this JAR file in multiple ways; for example, you can give a file from your local machine. Now, "local machine" doesn't mean my own local machine, but your cluster's local machine. In our case, the JAR file where all the source code resides is already present as spark-examples.jar, so I can give the reference like this... oops, that is not the JAR, it's this one. And the main class inside it: which particular main class do you want to execute while submitting the job? Okay, so inside this spark-examples.jar this is the class we want to execute, which is going to calculate the value of pi. Now, additional arguments: this is something very specific to each individual example. In this particular case we are going to calculate the value of pi based on a number of iterations; let's say a thousand iterations. Let's submit our job and see how quickly it can calculate the value of pi for us. Let's just wait here and check the output... yes, it has started running, going fine so far; let me refresh the logs... pending output, it's streaming... yes, you can have a look, we got "Pi is roughly" this value. So based on a thousand iterations with an iterative method over the unit circle, it has internally calculated this value of pi. All right everyone, the job has been successfully completed; you can see the status is Succeeded. This way you can easily lift and shift all your existing Hadoop and Spark jobs into the Google Cloud Platform. Now I don't need this cluster, so I'm just going to select it and delete it. All right everyone, that is all about Google Cloud Dataproc.
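The Spanner students table built through the console in the walkthrough above can also be expressed in code. This is a minimal sketch: the DDL mirrors the demo's schema, while the commented-out client calls show how it might be sent with the google-cloud-spanner Python library; the instance and database IDs (spanner-1, university_db) are just the names used in the demo, and the client part assumes an authenticated GCP project.

```python
# DDL equivalent to the students table created in the Spanner console demo.
# s_name as primary key is kept only because the demo chose it, not because
# a name makes a good key.
ddl = (
    "CREATE TABLE students ("
    " s_name STRING(100),"
    " age INT64"
    " ) PRIMARY KEY (s_name)"
)

# Hedged sketch of client-side creation (needs credentials; not run here):
# from google.cloud import spanner
# client = spanner.Client()
# instance = client.instance("spanner-1")
# database = instance.database("university_db", ddl_statements=[ddl])
# database.create().result()  # waits for the long-running operation

print(ddl)
```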
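The Datastore entity from the demo above (kind student, auto-generated numeric key, indexed s_name, unindexed age) can be sketched like this. The dict is only a local stand-in for what the console form captured; the commented calls use the google-cloud-datastore library and assume an authenticated project.

```python
# Local representation of the entity created in the Datastore console demo.
entity = {
    "kind": "student",
    "s_name": "john",  # indexed property
    "age": 25,         # excluded from indexes, as chosen in the demo
}

# Hedged client sketch (requires a GCP project and credentials; not run here):
# from google.cloud import datastore
# client = datastore.Client()
# key = client.key("student")  # numeric ID allocated automatically on put
# e = datastore.Entity(key=key, exclude_from_indexes=("age",))
# e.update({"s_name": entity["s_name"], "age": entity["age"]})
# client.put(e)

# The query fired in the console, in GQL form:
gql = "SELECT * FROM student"
print(gql)
```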
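The two garbage-collection policies chosen for the Bigtable column families above ("never collect garbage" for personal info, "maximum 2 versions" for prof_info) boil down to how many cell versions survive each write. This is a local simulation of that behaviour, not the Bigtable client:

```python
def write_cell(versions, value, max_versions=None):
    """Prepend the new value (newest first, like Bigtable timestamps) and
    garbage-collect older versions if a version-based policy is set."""
    versions.insert(0, value)
    if max_versions is not None:
        del versions[max_versions:]  # drop everything beyond the policy limit
    return versions

# "never collect garbage": every version of the cell is kept
personal = []
for v in ["a", "b", "c"]:
    write_cell(personal, v)

# version-based policy with a maximum of 2 versions, as set on prof_info
prof = []
for v in ["v1", "v2", "v3"]:
    write_cell(prof, v, max_versions=2)

print(personal)  # → ['c', 'b', 'a']
print(prof)      # → ['v3', 'v2'] : v1 was garbage-collected
```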
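The dry-run figure in the BigQuery demo above (17.70 MB to be processed) matters because on-demand queries are billed per byte scanned. The $5-per-TB rate below is an assumption based on the long-standing on-demand price and may differ from current pricing, but the arithmetic shows why a 17.7 MB scan is effectively free:

```python
def on_demand_cost_usd(bytes_processed, usd_per_tb=5.0):
    """Approximate on-demand query cost; usd_per_tb is an assumed rate."""
    return bytes_processed / 1e12 * usd_per_tb

scanned = 17.70 * 1e6  # 17.70 MB, as reported by the dry run in the demo
print(f"${on_demand_cost_usd(scanned):.6f}")  # a tiny fraction of a cent
```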
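The publish, pull, and acknowledge cycle shown in the Pub/Sub demo above can be simulated with an in-memory subscription. The real service is accessed via the google-cloud-pubsub client; this sketch only mirrors the semantics seen in the UI, where unacknowledged messages keep reappearing on pull and acknowledged ones are gone for good.

```python
class Subscription:
    """Toy model of a pull subscription: messages stay pending until acked."""

    def __init__(self):
        self._pending = {}  # message_id -> data, the unacked backlog
        self._next_id = 0

    def deliver(self, data):
        self._next_id += 1
        self._pending[self._next_id] = data

    def pull(self):
        # Pulling returns outstanding messages; without an ack they remain.
        return list(self._pending.items())

    def ack(self, message_id):
        # Acknowledged messages are removed and will not be redelivered.
        self._pending.pop(message_id, None)

sub = Subscription()
for msg in ["hello one", "hi bye"]:  # the two messages from the demo
    sub.deliver(msg)

for mid, _ in sub.pull():
    sub.ack(mid)

print(sub.pull())  # → [] : acked messages are gone, as in the demo
```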
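The SparkPi example submitted to the Dataproc cluster above estimates pi by the iterative method the video mentions: sample random points in the unit square and count how many fall inside the unit circle. A local, single-process sketch of the same idea (the actual job runs the SparkPi example class from the cluster's spark-examples.jar, distributed across executors):

```python
import random

def estimate_pi(samples, seed=42):
    """Monte Carlo pi estimate: fraction of points inside the unit circle,
    times 4. More samples (iterations) give a better estimate."""
    rng = random.Random(seed)
    inside = sum(
        1
        for _ in range(samples)
        if rng.random() ** 2 + rng.random() ** 2 <= 1.0
    )
    return 4.0 * inside / samples

print(estimate_pi(100_000))  # roughly 3.14
```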
Info
Channel: TechCode
Views: 133
Keywords: techcode, morioh, codequs, google cloud platform tutorial for beginners, what is google cloud platform, google cloud platform, google cloud platform tutorial, google cloud platform for beginners, google cloud services, google cloud platform services, google cloud, gcp tutorial for beginners, gcp tutorial, gcp services, gcp for beginners, google cloud storage, google cloud networking, google cloud database tutorial, google cloud database, google cloud console, cloud console, cloud
Id: Ncy7c9FP68g
Length: 112min 5sec (6725 seconds)
Published: Wed Nov 03 2021