Hey guys, welcome to this session on
Microsoft Azure. Let us start off this session with a fun fact: 80% of the
world's Fortune 500 companies run on Microsoft Azure, either completely or
to an extent, and Microsoft claims that Azure is growing at a rate of
120,000 new customers every month. In this session, we'll be learning Microsoft
Azure and Azure certifications end to end. Before moving on with this
session, please subscribe to our channel so that you don't miss our upcoming
videos. Also, if you want to become a certified Azure Professional, Intellipaat offers a lot of courses on Azure and you can check those details in the
description. Right now, let us take a quick glance at the agenda. We'll start
off with learning what exactly Microsoft Azure is, and after that, we'll be
learning what is the core architecture which Microsoft Azure runs on, and also
concepts like subscriptions and resources. After that, we'll be learning
the core services provided by Microsoft Azure. After learning these core services,
we'll be putting them into use by architecting an entire application using
Microsoft Azure. A quiz has also been prepared for you guys to recap what was taught in this session, and please feel free to comment your answers below. After that, for people who are confused about choosing the apt certification course, we have put together a video which shows you the different types of certifications available for Azure and also how you can crack them. After that, we'll be looking individually into various certification courses like AZ-103, AZ-203, and also AZ-300 and AZ-301. So guys, without any further ado, let us move on to the
session. In terms of Microsoft products, you and I both know that at some point in our lives, in IT or at home, we would have used a Microsoft product. That's how well they have designed their products: each and every person on this planet who is into IT must have used a Microsoft product at one point in their life, and Microsoft Azure gives you a benefit in terms of using Microsoft products. If you are on cloud and you want to use any Microsoft product, like Windows or SQL Server for that matter, then whatever price AWS gives you, Azure can give you a price which is about one fifth of that. The reason for that is very simple: Microsoft owns the licenses, and hence a license which is bought by Amazon is costlier to them, and when they give you that license on a shared, that is a timed, basis, obviously they will charge you more. But when it comes to Microsoft, they have those charges in hand because it's their own license. So what they tell you is that whatever AWS is charging you for a particular instance running Microsoft software on that server, they are going to give you that same server at one fifth of the price. So this is the first advantage that you get with Microsoft Azure: if you are into Microsoft products, if your application is using any kind of Microsoft product, you will find it about one fifth of the price when compared to AWS. The second advantage is that if you have
Microsoft licenses. If you are, let's say, in a business and you are using some Microsoft software, you obviously would have bought some licenses, and when you want to go onto cloud you do not want to pay for the licenses again; you only want to pay for, let's say, the server that you are using. Why should you pay for the operating system when you already paid for it when you purchased it for your office laptop or your office server? So the option Microsoft Azure gives you is the bring-your-own-license option: you bring your own license, you give the license key to them, and they will not charge you anything for the licensing. Even the one-fifth cost that they were charging, they will not charge you that if you have your own Microsoft license and you bring it to the Microsoft Azure cloud. So that is the second advantage that you get with Microsoft Azure, and the third advantage
that you get is that more than 95% of Fortune 500 companies are actually using Microsoft Azure. Now, that could be as their primary cloud provider, or they could be using Microsoft Azure as a secondary cloud provider, or they could be using just some products of Microsoft Azure, but if we talk about the Fortune 500 companies, 95% of them are using Microsoft Azure. So if you get yourself certified in Microsoft Azure, you have a 95 percent probability of hitting such a company, i.e., if you go to 100 companies, you have a 95 percent chance of landing in a company which uses some kind of Microsoft Azure service. That's a pretty huge number, and it gives you a higher probability of getting a job in these Fortune 500 companies as well. So guys, these are the three main advantages as to why Microsoft Azure is being preferred by businesses for their work, that's why companies have now started to shift to Microsoft Azure, and that is why it is seeing such a huge exponential growth rate when compared to AWS. So now you know why we are learning Microsoft Azure: the reason is obviously the chances of getting yourself a job with Microsoft Azure as well. Okay guys, a quick info: If you want
to become a certified Microsoft Azure professional, Intellipaat provides a
master training in Microsoft Azure Certification. They provide AZ-103, AZ-203, and also AZ-300 and AZ-301 in the same course. Also, you will be
learning all the major concepts required to crack all these certifications, and
this makes you a complete Microsoft Azure professional. For further
details about this course, check out the details in the description.
Let us continue with the session. Now let's move on and talk about
what exactly Microsoft Azure is. So guys Microsoft Azure is basically a
cloud service which is developed by Microsoft. It is owned and managed
by Microsoft. It offers you its services in a pay-as-you-go model, and it is the second-largest cloud provider in the world. So this is what Microsoft Azure is. Whatever I've taught you with regard to the concepts of cloud, all of it is given to you by Microsoft Azure, along with the advantages that I told you about, and all you pay for a particular service is based on the number of minutes that you're using it for. That's how amazing cloud is, and that's how amazing Microsoft Azure is. Now let's move forward and talk about how Microsoft
Azure basically works. So we have discussed enough about what cloud computing is and what Microsoft Azure is, and I hope you are all clear on why we are using it and why we are learning it today. Now let's get into the technical aspects of Microsoft Azure and understand how Microsoft Azure is structured in terms of its architecture. Now
let's go ahead and understand the Azure core architecture. This is the
core architecture for Azure. As you can see, there are four ways of accessing
Azure. These four components are basically nothing but four different
ways of accessing the Azure resources. So, the Azure resource manager is
basically a mediator between the Azure resources and the external agents which
can interact with the Azure resources. Now to interact with the Azure resource
manager, you need these four ways. Now what are these four ways? The first way
is the Azure portal. Now what is the Azure portal? The Azure portal is nothing
but the GUI website that you get provided with. For example, if you
go to portal.azure.com, once you have signed up on Azure, you can
basically go to that portal and you can deploy any kind of resource that you
want. So, you can control and manage the
resources from that interface. So that interface is called the Azure portal. The next thing is Azure PowerShell. Now what is Azure PowerShell? There is a thing called Microsoft PowerShell that exists on a Windows system. So if you're using a Windows system, you can just type PowerShell into the Start bar and you would be able to see what PowerShell is. It is a native Microsoft product. Now what Microsoft provides you with is an extension for making your PowerShell interact with Microsoft Azure, and once that extension is installed on your system, your PowerShell is able to connect to Microsoft Azure, which basically gives you the ability to control or deploy resources on Microsoft Azure using the command line. So that's Azure
PowerShell. Similarly, you have something called Azure CLI. PowerShell,
like I said, is a product from Microsoft. The Azure CLI, on the other hand, is basically used from the DOS-style command prompt. DOS is altogether different software which has been there for ages on Windows. So, if you are more inclined toward using the command prompt, you can install the necessary software for the Azure CLI and then use its commands to manage or deploy Azure resources. So this is the third way of accessing your Azure resources, and
then you have something called REST clients. Now what are REST clients? REST clients are nothing but APIs which you can include in your own web applications. For example, you can create your own web application through which you can use your Azure account. Let's say you create a website wherein you take an integer number as input from the user, and you have created a button called deploy, and what that does is deploy that many servers on Microsoft Azure. You are not doing that directly from the Azure portal, you're not doing it with PowerShell, and you're not doing it through the Azure CLI. What you have done is created your own application, linked it to your Azure account, and allowed that website to control your Azure resources, and this is possible using APIs, which are also called REST clients.
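Just to make this concrete (this is not shown in the video, only a rough sketch): if you wanted your own application to talk to Azure programmatically, one option is the Azure SDK for Python rather than hand-written REST calls. The subscription ID below is a placeholder you would fill in yourself.

```python
# Minimal sketch: controlling Azure from your own application code.
# Assumes `pip install azure-identity azure-mgmt-resource` and that you are
# already signed in (e.g. via the Azure CLI); the subscription ID is a placeholder.
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

SUBSCRIPTION_ID = "<your-subscription-id>"  # placeholder, not a real value

# DefaultAzureCredential picks up whatever credentials are available
# (environment variables, Azure CLI login, managed identity, ...).
credential = DefaultAzureCredential()
resource_client = ResourceManagementClient(credential, SUBSCRIPTION_ID)

# Every call below is ultimately a REST request to the Azure Resource Manager,
# authenticated with the credential above.
for group in resource_client.resource_groups.list():
    print(group.name, group.location)
```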
Now, in the case of Azure PowerShell and the Azure CLI, they interact with the help of SDKs to connect to the Azure Resource Manager, while your Azure portal and REST clients can interact with the Azure Resource Manager directly. And what is the Azure Resource Manager? Like I previously stated, it's nothing but a mediator. So basically what happens is, the resources which are deployed on Azure are not directly accessible by the user, whether he is using the website, PowerShell, the CLI, or a REST client. Every request has to go to the Azure Resource Manager, and what the Azure Resource Manager does is authenticate your request: it will check whether you, as a user of the Microsoft Azure account, have the necessary permissions to do the task which you're trying to do with
your application. For example, when you create a Microsoft Azure account, you have the option of creating multiple users in that same account so that multiple people can work on your account. Let's say you open a startup and you have hired a tech team. What you want to do is give your CTO, or the person who's going to be managing your tech team, admin access to your Azure account. On the other hand, let's say one of the developers is only going to work on Azure storage, so you just want him to access Azure storage; you don't want him to access Azure VMs or any other kind of Azure service. So what you can do is create a user and give definite permissions for what he can and cannot access. Now how is this helpful? Let's say your technical manager signs in as a user, say through the Azure portal, and then he tries to deploy a VM resource, that is, a virtual machine. He would be able to do so, because he has the administrator access you have given him, and he'd be able to do everything that you can do. On the other hand, let's say your developer comes in. He also logs in through the Azure portal and he also sees everything which is there, but the moment he tries to deploy a virtual machine, because all his requests go through the Azure Resource Manager, the Azure Resource Manager authenticates this guy and checks whether he has the permission to launch a virtual machine or not. In this case the authorization fails, so the Azure Resource Manager will not take the request forward. It will basically revert back with an error message saying you do not have sufficient permissions to carry out this task, and this is the sole reason we need a mediator between our Azure resources and the various ways through which we can use Azure.
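As a small, hedged illustration of what that rejection looks like from code (not something shown in the video): if the signed-in identity lacks permission for a write operation, the SDK surfaces the Resource Manager's refusal as an HTTP error. Creating a resource group stands in here for any action the user is not allowed to perform; all names are placeholders.

```python
# Sketch: what an ARM "insufficient permissions" rejection looks like from code.
# Assumes azure-identity and azure-mgmt-resource are installed; names are placeholders.
from azure.core.exceptions import HttpResponseError
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

client = ResourceManagementClient(DefaultAzureCredential(), "<subscription-id>")

try:
    # A developer whose role only covers storage would be denied this write.
    client.resource_groups.create_or_update("vm-group", {"location": "southindia"})
except HttpResponseError as err:
    # ARM replies with an authorization error when the caller lacks the permission.
    print("Request rejected by Azure Resource Manager:", err.message)
```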
This would be the same case with Azure PowerShell and with the Azure CLI: when you use PowerShell with your Azure account, you obviously have to enter your credentials. In the case of the developer, he will be entering the credentials that you have given him, and those only have access to the storage, so again he would get the same kind of message in Azure PowerShell, and similarly in the Azure CLI and even in the REST clients. When you connect your website to your Azure account, you will get a key and a secret access key, or in the other case what you'll have to do is include some metadata on the server on which you are hosting that website, and that metadata basically authenticates you to the Azure account. So for every user you get a different set of credentials for your metadata; in the case of the developer you will get a different set of credentials, and when they are authenticated they inherit the same permissions that the developer account has. Accordingly, your Azure Resource Manager will authenticate and check whether you have the required permissions or not, and accordingly you will be able to do the operations. So, summing up, or just revisiting, what we just studied:
there are basically four ways to access Azure: you have the Azure portal, Azure PowerShell, the Azure CLI, and REST clients. All of these interact through the Azure Resource Manager, and the Azure Resource Manager, based on what kind of permissions you have, routes your request to the necessary service and carries out the task for you. So this is a gist of how Azure works in the backend. Now let's move forward and talk about the front end, the middleware, and the services. The front end, as we discussed, is the four ways of accessing Azure services: the Azure portal, Azure PowerShell, the Azure CLI, and the REST clients. The middleware is nothing but the Azure Resource Manager, which we'll be discussing, and then you have the services, which we'll be discussing in a few moments. Okay
guys, a quick info: If you want to become a certified Microsoft Azure professional, Intellipaat provides a master training in Microsoft Azure Certification. They provide AZ-103, AZ-203, and also AZ-300 and AZ-301 in the same course. You will be learning all the major concepts required to crack all these certifications, and this makes you a complete Microsoft Azure professional. For further details about this course, check out the details in the description. So let us continue with the
session. So let's start off with the front-end component. What is the Azure portal? So guys, the Azure portal looks something like this. It's a GUI component that you get in your web browser, and in your web browser you can see, on the left-hand side, things such as all resources, web apps, and SQL databases. You can just select the service that you want and subsequently launch it. We will be discussing more on this as we move along; I'll be showing you guys the portal and how you can use it to deploy resources. For now, this is basically what the Azure portal looks like. On the right-hand side you get all kinds of widgets, and these widgets show different pieces of information. You'll have the option of including these widgets on a dashboard; more on this as we move along in the session. So this was the Azure portal. Next is Azure PowerShell. So I told
you guys, Microsoft has launched its own command line which now comes integrated with Windows, called PowerShell. If you want to use that PowerShell with your Azure account, what you'll have to do is download some modules which help your PowerShell connect to Azure, and once you do that you would see something like this: Windows Azure PowerShell. Now you can enter the PowerShell commands which will help you connect to your account and carry out whatever you want in a command-line fashion. Similarly, if you are into DOS, you can go ahead and use the Azure CLI. DOS commands are different from PowerShell, and there could be a scenario wherein you're better at DOS commands than at PowerShell commands. If that is the case, you can go ahead and just install the libraries for the Azure CLI and then use it in the DOS environment. In this case also, if you want to manage Azure resources like you were managing them through PowerShell, through command-line statements, you can do it in a similar way with the Azure CLI as well, but the commands that you would be entering here would be more in line with DOS commands, whereas with PowerShell we would be using PowerShell commands to control Azure. OK, so this was the Azure CLI, guys,
and the next and last topic is REST clients. So REST clients, as I already told you, are nothing but APIs that you use in your own applications. You'll have to authenticate them using the metadata or credentials, and these credentials are available for all the accounts that are created under the root Microsoft Azure account, so each credential would be different from the others if there are multiple users. For example, in our case we created a user for our IT manager and then a user for a software developer; to the software developer we just gave storage access, and to our IT manager we gave administrator access. So when they put their credentials in the code of their web application and try to connect to Azure, based on the permissions that they have they would be able to access the resources, and in case they do not have the permission, they will get a corresponding error saying this particular user does not have the necessary permissions to access that particular Azure resource. So these were the front-end
components of the Azure core architecture. Now let's talk about the middleware, the component which acts as a mediator between your touch points, that is your front-end components, and the core components, which are the Azure resources. The mediator is the Azure Resource Manager. The Azure Resource Manager basically helps you to deploy and manage Azure resources. It also helps you to organize resources as well; for example, it helps you in grouping resources together. Let's say I'm launching a web application, and at the same time I'm also launching some other application, say the back-end service for my Android app. If I'm launching a web application, I would need certain components: let's say I need separate storage for the web application, so I create a storage account, I create a web app, and then I create a database. Now if I create them just like that in Azure and try to look at all the resources, I would not be able to make out which resources are there for my Android app and which resources are there for my web application. What you can do is define a nomenclature and name your instances accordingly; for example, for the Android database you can say android_database, and for your web application you can say webapp_database. That's one way of sorting it out, but the Azure Resource Manager makes it easier for you. How does it make it easier? It basically gives you the ability to create groups. So if you're creating one kind of application, you can create a group, and inside that group you can map or launch the resources that are specifically there for that particular application. For example, for a web application I can launch all the resources that I want and group them together in the web application group, and I can name the group anything. It's just like a folder, guys, if you want to understand it more simply; it's just for sorting out resources. Let's say you have a desktop full of files, and those files are PDF, TXT, and DOC documents. What you can do is create three folders: inside the DOC folder you put your DOC files, inside the PDF folder you put the PDF files, and inside the TXT folder you put the TXT files. That is the way of sorting on a desktop, and similarly, if you want to sort your resources in Azure, you can use the ability to create resource groups using the Azure Resource Manager. At the same time, the third feature of the Azure Resource Manager is that it authenticates: it checks what permissions you have, and based on that it gives you the ability to control your resources. So it also authenticates your calls to the Azure resources, and only through the Azure Resource Manager do your calls go; if they are accepted by the Azure Resource Manager, only then will they be forwarded, otherwise they will be rejected and you will get an appropriate message. All right, so this was all about the Azure
Resource Manager. I was talking to you about what a resource group is. As you can see, let's say this is a resource group of mine, and what I want to do is group all these three resources. My VM instance that I launched is termed a resource; if I launch a web app, that is also termed a resource. Don't worry if you don't understand what a virtual machine or a web app is, I will talk more about this as we move along in the session, but for now understand that any service in Azure, any instance that you launch in Azure, is basically a resource. When you group two or more resources together for a particular purpose, like the folder analogy I gave you, grouping them together so that they symbolize something, that can be done using resource groups, and that is exactly what is specified in the figure as well. So you have some virtual machines, you have app servers and your SQL database, and you have grouped them all together in a resource group, and you can call it, let's say, the production environment. You can create a test environment as well: call the resource group a test environment and deploy the necessary resources inside that.
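As a rough illustration of that grouping idea (a sketch, not something done in the video), here is how you might create a group for the web application and list whatever gets deployed inside it with the Azure SDK for Python; the group name, location, and subscription ID are assumptions.

```python
# Sketch: using a resource group as a "folder" for one application's resources.
# Assumes azure-identity and azure-mgmt-resource are installed; names are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

client = ResourceManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Create (or update) a group that will hold everything for the web application.
client.resource_groups.create_or_update(
    "webapp-production", {"location": "southindia", "tags": {"environment": "production"}}
)

# Later, you can list only the resources that belong to this one application.
for res in client.resources.list_by_resource_group("webapp-production"):
    print(res.name, res.type)
```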
Okay guys, a quick info: If you want to become a certified Microsoft Azure professional, Intellipaat provides a master training in Microsoft Azure Certification. They provide AZ-103, AZ-203, and also AZ-300 and AZ-301 in the same course. You will be learning all the major concepts required to crack all these certifications, and this makes you a complete Microsoft Azure professional. For further details about this course, check out the details in the description. So let us continue with the session. So guys, with this we come to the end of the core architecture,
so by now you understand the nitty-gritty of how Azure works in the backend and how things are structured inside Azure. Moving forward, guys, now we will be talking about the Azure services. We'll be talking about the resources that we just saw, the resources that we were grouping inside the resource group, the resources which the Azure Resource Manager was managing. We're going to talk about those, and those are nothing but services in Azure. It will become clearer as we move along, so let's start off now with the core Azure services. Before starting off with the services, guys, here is a simpler way to understand all the services in Azure. There are around 300-plus services in Azure right now, and it is difficult to remember all those services or to know how one service is different from another, so what Azure has done is divide its services based on what the services do. The major sections it has divided its services into are these. These are not all the sections that Azure provides services in, but these are the important sections, or the important domains, in which Azure provides you services, and as a cloud engineer or an Azure engineer, when you'll be working in companies, mostly you would be working on services which are included in these domains. We will be covering these, guys. There is no Azure engineer out there who knows each and every service in Azure, but what we as learners can do is learn all the important services which are used in your everyday life when you become a cloud engineer. As and when you apply to companies, you can tell them you know all the basic services, all the important services, on Azure, and if there is any special requirement for a particular service that they use, it is obviously going to be an easier task for you, since you understand how Azure works, so picking up one more service in Azure would not be a big deal. But most of the time, I would say 90% of the things that you'll be doing as a cloud engineer, or 90% of the services that you would be using or working on as a cloud engineer, would be covered under these domains. So now let's go ahead and start
off with the first domain, which is compute, but before that let me give you a brief about all of these domains. The first domain is compute: in compute you have compute-intensive resources wherein you get raw processing power; more on this we will talk about as we move along. Then the second kind of domain is networking: this particular domain includes all the services which provide you with networking capabilities. Then we have the storage domain, which gives you all the services in Azure that give you the capability of storing files; obviously there are a lot of services in storage, and each service is targeted at a different kind of use case, so we're going to discuss some of the storage services in Azure which are very prominent. Then you have services related to database and analytics: if you want to store textual data which you also want to analyze using graphs or charts, you can do that in Azure in the database plus analytics domain. Then our next domain
comes out to be AI and machine learning. Now this is a very important domain, guys, and I've actually seen people who are data scientists, people who are researching data science, use it, because when you're working on data science or with AI you need a lot of computing power. Now the good thing about cloud is that it offers you a pay-as-you-go model, which basically means you only pay for the time you use the resources for. Not everybody can afford a high-spec machine: if you need, say, an i7 octa-core processor and 32 GB of RAM, it's very difficult for a layman to get hold of such a machine. So what researchers and data scientists can do is launch a machine on Azure with the spec that they want for their use case, use it for the time they want, and shut it off, and you'll be charged as low as around 0.05 or 0.06 dollars, I guess, if you use it for around half an hour or so. Half an hour or 45 minutes should be enough if you want to do a POC, or if you're doing research, or if you're into testing, so I guess this would be the cheapest option for you rather than getting a full-blown server or a full-blown laptop with high specs; all of that would be more expensive for you, so it's better if you use services from the cloud for your compute-intensive needs. But that is not all you get in the AI and machine learning domain: you also get a prebuilt dashboard kind of thing, so you don't have to feed in algorithms; everything is pre-built in Azure, and you basically get drag-and-drop functionality, particularly when you are using Azure ML. Because I've used it, I can tell you it's pretty much drag-and-drop if you compare it with its counterparts: let's say you want to do a regression test in R, you would actually have to write a long script to do it, but with the kind of services that Azure has launched, for example Azure ML like I mentioned, all you have to do is drag and drop and you can get your results. So that's how convenient it is, and that's why, to save their time and to save on their costs, people are more and more using the AI and machine learning capabilities of the cloud, and Azure also has AI and machine learning services, which we will be discussing as we move along in the session. Then the next domain is identity. The identity
domain basically includes services which help you in authenticating users; it will help you to get the metadata credentials that we discussed earlier for your website. So all the authentication part, the authorization part, whenever you want to give specific permissions to particular users, all of that can be managed with the identity services. And lastly, we will be discussing the management domain, which basically includes services such as monitoring and services such as infrastructure as code. Don't worry about these big words, guys; as we move along everything will become clear. But without wasting time, let's jump on to our first domain, which is compute, and let us understand the different services we're going to look at there. So in compute you basically have four prominent services: you have virtual machines, you have function apps, you have App Service, and then you have the Azure Kubernetes Service.
So the first service that we're going to discuss is the Azure Virtual Machines service. Now what is that? Azure virtual machines are nothing but servers, or, in the most layman terms possible, let's just say a virtual machine is a computer, just like a laptop, which has just an operating system installed on it and nothing else. For example, let's say you buy a laptop: the first thing that you see is your Windows there, or a macOS there, or a Linux operating system there, and basically no software is installed on it; it's just a bare-minimum operating system which is given to you, and whatever you want to do on your laptop, you install it and then you work on it. Similarly, virtual machines are nothing but machines that are launched by your cloud provider, which in our case is Azure, and they launch them for you, so you don't have to worry about where they are getting them from. They launch it for you and they give you remote access to it, so it's basically like working on your own laptop, but from a remote location. So you get your own machine on Azure which is freshly made: you have a fresh operating system on it, there is no software installed, and whatever you want, you can install on this particular server. We call it a server because basically you'd be using it for deploying applications which would be available to users, and that's why we require a server. So basically these are servers or computers that are launched by Azure for your own personal use, but you access them remotely. If it's a Windows machine, in the GUI case, you can basically do an RDP and use it; if it's a Linux machine, you can do an SSH and work on its command line. But remember, there is no extra software installed on it, so anything that you want the server to do, you set it up yourself. For example, let's say you want to create a database server: you can just install a database on it and it will be ready, ready to accept requests for the database service. If you want it to be a web server, you can install Apache, PHP, etc., and then you can put your website on it, and it'll become a web server. So anything and everything that you want, or that you can do on your laptop, you can do on virtual machines. So this is what Azure Virtual Machines is, guys. Now the
next kind of service is similar to virtual machines, but it's actually an advanced version of a virtual machine. So what are function apps? Function apps, like I said, are an advanced version of a virtual machine in which you do not get access to the operating system. Let me give you an example. Let's say you are doing some work on a website and that website does some processing for you. For example, let's say you are on Facebook and you upload a profile picture, and once you upload it, you want to crop that profile picture to a particular size and then upload it as an image on your Facebook. So what will happen? You will upload your image in the Facebook app, you will do the cropping, maybe you add a filter, and then you click on save and you want it to be uploaded as your profile picture on Facebook. So there is some amount of processing that has to be done on your image, and basically how that works is that the processing is actually not done on your phone or on the computer on which you're accessing Facebook; it is actually done on the Facebook servers. The way they handle it is that they have a web server in place, they separately have a database server in place which interacts with the web server, and for the processing they have a separate server, and that separate server is basically called a back-end server. And what does that back-end server do? All the requests which come in, it basically processes them and gives out the result. That's the sole job of a back end. And why is it separate? Why is it not
integrated with the web server? The simple reason is this: let's say there are a hundred users who are using your website, and they're constantly exploring the website, going through the UI, uploading images, doing processing and everything. Now your back-end component is doing the image processing, and the front-end component of yours, which is the website component, is serving the website to the users. So there are two things happening there, and if any one of them gets overloaded with work, that would hamper the performance of the second component. Let's say there are more users on the website; obviously the website component of your server would require more processing power, so let's say it takes up 60% of the processing. Now all these people who came in upload their profile pictures, put some filter on, and want them processed. So one thing is your server is already busy serving your website to these hundred people and the CPU is at 60%; now the processing load which is coming onto your back-end component also requires around 80% CPU to work efficiently, but because 60% has already been taken by your website server, there's only 40% left. What will happen in this case? Your server will inadvertently become slow, and it might even crash in case a deadlock situation happens. This was usually the case when we were not even aware of distributed computing, but nowadays we are dealing in a world where we cannot afford downtime, and that is the reason everything has been distributed: if there is more load on my back-end server, my back-end server is going to handle it, and it will not hamper the performance of my website server or my database server. So every software component is now distributed. And why are we discussing this? We are discussing this because we are talking about function apps. Now what is a function app? A function app is nothing but
a back-end service on which you do not have to deal with the operating system like you do with virtual machines. You do not get access to the operating system and you cannot install any kind of software on it. You just have the option of choosing what kind of back-end code you want to run on the function app. Let's say whatever code you have written for the image processing, you have written it in Python. So when you are deploying a function app, what you'll be choosing is what kind of code you'll be uploading on the function app; let's say I choose Python. The next thing you do is click on next, and it will make your function app ready, and then it will ask you for the code for the image processing. You give it the code, you click on save, and now your back-end server is ready to accept requests. That's it. So now all your website server has to do is ping the back-end server with the required information, with the image; it will take the image, run it through the code, see what has to be done, do the processing on the image, and give the image back to the website server, or the Facebook servers, which then upload the profile picture. So, in a gist, what is a function app? A function app is basically a service which will not give you access to the operating system; it gives you one dashboard on which you can upload code, and it can do any task. All you have to do is give it the work and it will do it for you, and that's about it. You don't have to mess around with anything else, you don't have to worry about what machine size you should put your function app on, you don't have to worry about how many machines you should run so that your function app never gets overloaded; everything is managed by Azure. All you have to do is give it the code and see your application working.
These kinds of applications are basically called platform as a service, and the earlier service that we discussed, virtual machines, is basically called infrastructure as a service. Why infrastructure as a service? Because in that case you've got access to the operating system: you can do anything on that server, you can store anything, you can actually delete everything as well, you can make that server hang or stop working, you can even uninstall the installation files or the operating system itself. That is something you can do with your server, but it will not cause any harm to Azure; you would simply not be able to use that server any more, and you'll basically have to delete it because it won't be usable. That's the most extreme thing which could happen. But in the case of a function app, because it's a platform which gives you a dashboard on which you can upload code, hence platform as a service, it is not giving you access to the operating system, it is not giving you a choice of what software you can install; you just choose the environment in which your code can run, and that's it. That's what a function app is all about: it's a back-end server without toiling over or worrying about the underlying infrastructure.
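To make that concrete, here is a hedged sketch of what such back-end code might look like as an HTTP-triggered Azure Function written in Python; the resize logic is a stand-in assumption for the Facebook-style image processing described above, and it assumes the Pillow library is listed in the function's requirements.

```python
# __init__.py of an HTTP-triggered Azure Function (Python programming model).
# Sketch only: assumes Pillow is in requirements.txt; the 300x300 size is arbitrary.
import io

import azure.functions as func
from PIL import Image


def main(req: func.HttpRequest) -> func.HttpResponse:
    # The website server pings this back end with the raw image bytes.
    image = Image.open(io.BytesIO(req.get_body()))

    # Do the processing step (here: shrink to a square thumbnail).
    image.thumbnail((300, 300))

    out = io.BytesIO()
    image.save(out, format="PNG")
    # Hand the processed image back to the caller (the web server).
    return func.HttpResponse(out.getvalue(), mimetype="image/png")
```

The function app only needs this code and the chosen runtime (Python here); which machines actually run it, and how many, is Azure's concern, which is exactly the platform-as-a-service point made above.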
Moving forward, the next service that we have is App Service. App Service is yet another platform-as-a-service kind of resource that we have in Azure, and what you can do with App Service is launch or deploy websites. Now you might be wondering: in a function app also I can just give my code, so probably I can give it my website code and it will deploy the website for me. No: your function app can only give outputs based on inputs; it cannot deploy a web application for you. If you want to deploy a web application, you'd have to use App Service. In App Service you will find a resource called a web app; you'll have to deploy that, and on that you will again get a dashboard, since it's a platform as a service, where you will be able to upload your website files. Once you do that and the process is finished, you will get a link, and if you click on that link you can see your website. So again, we do not get access to the operating system and we do not have control over what software is installed. The basic way App Service works, or in our case, since we are discussing websites, the way a web app works, is that it asks you what kind of code your website is written in. Let's say my website is written in Node.js, so I'll just select Node.js and click on next, and the next question it will ask is whether you want to auto-scale when CPU usage increases or when memory is low. You say yes, I want to auto-scale, you click on next, and it does everything for you, and at the end what you get is again a web UI with an upload button where you just have to upload your website, and everything is set up for you. There is also an option to directly mention a GitHub link; it will pull the code from GitHub and deploy whatever code was there on GitHub. But the keyword here is that it deploys an application for you. This is a very tricky thing, the difference between a function app and a web app: a function app only does the back-end tasks for you, you give it code, it takes an input and gives you the output, that's it, it is not used to deploy an application; App Service, on the other hand, is used to deploy an application. So this is the main difference between a function app and App Service.
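For a sense of what you would actually upload to such a web app: the demo above talks about Node.js, but sticking with Python for consistency, a tiny Flask site would look like the hedged sketch below (the app name and route are assumptions, not anything from the video).

```python
# app.py - a minimal website you could deploy to an App Service web app.
# Sketch only: App Service runs the code for you; you never touch the underlying OS.
from flask import Flask

app = Flask(__name__)


@app.route("/")
def home():
    # The page returned when someone visits the web app's URL.
    return "Hello from an Azure App Service web app!"


if __name__ == "__main__":
    # Local testing only; in App Service the platform starts the app for you.
    app.run(host="0.0.0.0", port=8000)
```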
The next service is probably a very important service when you're working in a containerization environment. I'm sure most of you know what Docker is; if you don't, just search on YouTube for the Docker tutorial by Intellipaat, go through that video, and you will be able to understand what Docker is. Docker is nothing but a containerization platform on which you can deploy applications. We were talking about distributed computing, and containers are nothing but separate virtual entities which are isolated from each other. So I can launch an Ubuntu container, I can launch a CentOS container, I can launch a container of a different flavor of Linux as well; in them I can install any software I want, and whatever code files I want, I can put on those containers, and these containers can then interact and make a distributed-application kind of scenario. Now what Kubernetes is, is a service which manages these containers automatically. What does it manage? Let's say I deploy three containers: one is my website container, one is my back-end container, and one is my database container. Now for some reason my back-end container stops working. What should ideally happen? Ideally, if my back-end server stops working, I need to get an alert, I will go there, I will see what the problem is, I will fix it, and my back-end container or back-end server will again be ready and my application will serve normally. But you cannot monitor an application 24 by 7, so what Kubernetes does is take care of all those manual tasks for you: it automatically detects that a fault has occurred in a particular container, deletes that container, and launches a new copy of it automatically. And this is just one of the tasks that Kubernetes does automatically for you; you can also configure Kubernetes to scale your containers up or down for you, and it can do a host of other things, which we will probably discuss when we move along and focus on the Azure Kubernetes Service, because there are loads of features in it. But this is again a very important service.
important service so most of the startup companies are not not MNCs but most of
the startup companies have now adopted the container architecture or have now
made their code in to fit into containers and now they are using the
community service if I were to give you a little background of the Cuban IT
service the Cuban IT Service was actually developed by the Google company
later they made it open-source and now it's available to the world anybody can
use it or download for free and can actually install it on their system
right but when you install it on your system you actually have to manage Cuba
native all by yourself so what does your does is it has created platform as a
service again and it says I will handle the Cuban ities installation I will
handle the Cuban --'tis configuration you just tell me what you want and I
will deploy that in a Cuban --'tis cluster for you right so all the
advantages of Cuban ities you get and you also don't have to deal with Cuban
at ease so this is the power of cloud this is what as your is giving you as a
service right and this is what as your Cuban 80 service was all about ok guys a
quick info: If you want to become a certified Microsoft Azure professional, Intellipaat provides a master training in Microsoft Azure Certification. They provide AZ-103, AZ-203, and also AZ-300 and AZ-301 in the same course. You will be learning all the major concepts required to crack all these certifications, and this makes you a complete Microsoft Azure professional. For further details about this course, check out the details in the description. So let us continue with the session. All right guys, so now let's get down to the interesting part. Let's now
start off with a hands-on where you will get a taste of how the Azure portal basically looks and also how things work out, how you deploy resources on Azure. In essence, we learned what a VM is, so we're going to see how we can actually launch a VM inside Microsoft Azure. All right, so just give me a minute, I'll just switch to my portal and then we will start with the demo. Now, before we go there, guys, these are the three things that we're going to do. When we were learning the architecture, I told you guys that all the resources that are deployed on Azure can be managed, or grouped together, using resource groups, so the first thing that we're going to do is create a resource group which will have the name demo environment. Then, in this demo environment, we are going to launch a virtual machine which will have a Linux operating system on it. Once it is deployed, we will try to connect to this virtual machine using the PuTTY tool that is available for doing SSH into Linux. So basically, whenever you're launching a Linux instance, the way to connect to it is using SSH, because it's a command-line operating system; everything and anything that you want to do on this operating system, you can do on the command line. On the contrary, if you deploy a Windows machine, in that case you have to use the Remote Desktop Protocol, so it's basically called RDP, and there's a software for that which can connect using RDP. That is built into Windows, so if you are on a Windows machine, just type remote desktop and you'll get that application; the only thing it'll ask is the IP address, you click on connect, and then it asks for the password. All right guys, so without wasting any more time, let's go
ahead, switch to my Azure portal, and see how it looks. So guys, this is my Azure portal. Let me just go to home and show you how it actually looks. So this is how the portal looks, guys; this is the GUI that we were talking about. Now, there are some resources that are already launched in my Microsoft Azure account, and as you can see, I have named my resources something like this, but let's say I am going to deploy a Linux VM; that VM is only for demo purposes. Now, this will actually give you a take on how resource groups can help you, guys. If I go to all resources right now, you can see these are all the resources which are deployed on Azure right now, but I do not know which resource is doing what or which resource is contributing to which application. On the contrary, if I go to resource groups, you can see that there is a group called intellipaat which exists on my Microsoft Azure portal, and if I go inside this resource group I am able to see all the resources which have been deployed in it. Now, because what I am doing right now is just for demo purposes, what I'm going to do is add a resource group. So let me just click on add, and let us create a resource group by the
name demo environment, so I will type in demo-env. And for the region that we have to choose: always choose the region, guys, which is nearer to you. So what do you mean by region? Region basically means that Microsoft Azure is so huge that it has set up its data centers in different countries. The reason they have done so is this: let's say you have a startup and you have actually started it up in, let's say, India, and your application is only accessed by an Indian audience; that's whom you cater to. So it makes more sense to deploy your application or your website in a region which is nearer to your audience, because in that case the internet route will be far shorter when you compare it with launching your application in some other country. For example, let's say we create this resource group in, let's say, the US, but my target audience is basically in India. Now what is going to happen? When my target audience is in India and, let's say, I'm going to the Intellipaat website, which is my application, every time the data is being fetched from the US, it is being transferred to the Indian servers, and then from your ISP you are getting the data on your system. Now if we were to deploy our application in India itself, what will happen is that the distance the data has to cover will be less, and that will significantly improve the response time of the website. So that is why choosing a region is very important. You can choose any region you want, but I would suggest you choose a region which is nearer to you. So in our case let's scroll down and see
if we have any region which is nearer to me. I can see that we have a region near me called South India, so let us select that. All right, so now the region that my resource is getting deployed in is South India; that's what I have selected, and the name that I have given is demo environment. Now let's go ahead and click on review + create. So this is what is going to happen: the subscription is going to be pay-as-you-go (if you are on the free tier you will see free tier), the resource group is the name that I have given, and the region in which this is being deployed is South India. Let's click on create. So now my resource group is being created, and if I do a little refresh over here, I can see that my demo environment resource group is present here.
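For reference, the same step could be done from code instead of through the portal; here is a hedged Python SDK sketch that mirrors what was just clicked through (the subscription ID is a placeholder, and "southindia" is the SDK-style region name I'm assuming for South India).

```python
# Sketch: creating the same demo resource group with the Azure SDK for Python.
# Assumes azure-identity and azure-mgmt-resource are installed and you are logged in.
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

client = ResourceManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Equivalent of filling in the name and region and clicking "Review + create".
group = client.resource_groups.create_or_update("demo-env", {"location": "southindia"})
print(group.name, group.location, group.properties.provisioning_state)
```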
Now, the next step that I have to do (let me just go back to my slide) is to deploy a Linux VM in this demo environment resource group. So let's do that. Let us come back to our Azure portal.
Now what we want to do is deploy a virtual machine, so you will see on the left-hand side that there is a service called virtual machines; let's select it. Now what I have to do is click on add, because I want to create a virtual machine. So I click on add, and now it will ask which resource group you want to make this server available in; I want to make it available in demo environment, so this is what I have selected. What do I want my virtual machine name to be? Let's say I want the virtual machine name to be new-vm-linux; this is what the name is. The region I want to be nearer to me, so I select South India. For the availability options, if you go ahead and choose, you will have an option between an availability set and no infrastructure redundancy required. So what is redundancy? Redundancy basically means: do you want a copy of your server somewhere else as well, so that if the data center where this server is going to be deployed goes down, or something happens and the data gets corrupted, then, because you chose to replicate your server, even if there is downtime in the particular data center in which your VM was deployed, your redundant server will come into play and that is from where you will get your data. So basically it increases the availability of your application, and even if there is a problem on the Azure side, your application will not go down. But since we are doing a demo, what we will select is no infrastructure redundancy required. What is the image that you want? By image I mean the operating system, that is, what kind of operating system do you want to connect to. You have all these operating systems over here: I can actually launch a Windows 10 Pro, I can launch a Windows Server, I can launch Ubuntu 16.04 and other versions, CentOS, SUSE Linux, Red Hat, Ubuntu Server. For our sake, let's consider Ubuntu Server is what we want to deploy. Now it will ask me what size of machine we want to give it. What you can do is actually change the size: by default it might deploy, let's say, a two-CPU and 8 GB machine, but you can just click on change size and you will have all these sizes available to you. Because what we are going to do is a demo, what you can see over here is that 0.5 GB of RAM and 1 CPU will cost me around 383 rupees per month, so let us go ahead and try to deploy this machine. This makes more sense because we are only doing a demo and will probably delete it after this session. So I've selected this and I click on select; now you can see it's 1 vCPU and 0.5 GB memory. Great. The next thing it is asking me is what authentication type
that you want to give on this particular machine you can either choose the ssh
public key or what you can choose is password right so if you want to choose
ssh public key what you'll have to do is you'll have to generate ssh public key
right if you choose password what it will ask us the username and the
password if you and this is basically the most standard way of doing it but if
you want to give security to your instance the suggested method would be
to create a public key now how can you create a public key there is a software
called put t jen right and the way you can download it is just type on google
put the download right and then you will get a link from put t dot org just click
on that and you can say you will see this link which says download putty just
click over here right now you will get all the versions on which you can basic
using which you can pay to basically install putty now what we're basically
looking for is pretty Jen right so you will find it over here based on your CPU
architecture you can basically select put each n and then you can download
once put each n is downloaded it will look something like this so as you can
see it's putting a generator what I want to do is I want to generate a public
private key so let's click on generate now what will ask is you'll have to
hover your mouse here for some randomness and it will create a key
based on that so let's hover our mouse in this region and now it has created
the key for me now it says the be careful pasting into OpenSSH
authorized if I so this is the public key basically right and you will have to
copy this let's copy it right so this is my public key now let's
come back to my portal. I can just choose SSH public key and paste the public key over here, and this is how it works: if your key is verified, you will get a tick mark over here, which basically means the key has been verified and you can go ahead. So my key is now verified; the next step that I'll have to do is save the private key. This is the public key which has been generated; what I have to do now is save the private key, which I'll be using to connect to my instance. In this case you basically do not have to use any password if you have this particular file, which will be saved once I click on yes. Let's say I save it on the desktop; actually, let me save it somewhere else: let me save it in the C drive, inside an app folder, and I'm naming this private key file azure-key, okay, and now I'll click on save. So my PPK has now been saved; if I go to C and then to app, I can see that azure-key.ppk is now present over here, great. So this is done; now I'll have to specify the username, so let's specify the username as azureuser. You can specify any username that you want, but we have specified it as azureuser.
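As a quick aside: if you are on Linux, macOS, or Windows with an OpenSSH client, you don't strictly need PuTTYgen; a key pair can be generated from the terminal as well. This is only a minimal sketch; the file path and comment below are placeholders I am assuming, not something from the demo.

```bash
# Generate an RSA key pair; the private key stays on your machine,
# the .pub file's contents get pasted into the Azure portal.
ssh-keygen -t rsa -b 4096 -f ~/.ssh/azure-key -C "azureuser"

# Print the public key so you can copy it into the
# "SSH public key" field shown in the portal.
cat ~/.ssh/azure-key.pub
```
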
now the next thing that it is asking is do you want any inbound ports to be
enabled on this particular instance now we will be connecting to this instance
from our machine right and the protocol that we will be using is SSH so we will
have to open the SSH port, so we will click on allow selected ports. What ports do I want to allow? I want to allow the SSH port; let's select HTTP also, and why, I'll tell you in a few moments once we have the Linux machine up and running. In case this was a Windows machine, I would also have to enable the RDP port, which you can just tick over here; in my case I don't need the RDP port, I just need SSH, and I need HTTP too, and the use for it I'll tell you as we move along. All right, so I want SSH and HTTP to be enabled, and what I can now do is I
can just click on review plus create and now it should basically just show me all
the configuration for my VM which is going by default, and I can just review it, and that will be it. Now I will show you something very interesting over here once this loading is complete. As you know, cloud computing doesn't charge you per month; the pricing model is basically per hour. So as you can see, the machine that I'm launching right now costs around 50 paise per hour; this is the rate, or the pricing, of the machine that I'm going to launch, so that's awesome, right. The next thing is the OS which is being launched, which is Ubuntu Server 18.04, and that is also great. If you want, you can also look at the username, which is azureuser; the public inbound ports, which are SSH and HTTP; the disk type, which is SSD; whether it is a managed disk, yes. The network: it's basically creating a new virtual network for me which goes by the same resource group name, which is demo-env. So what happens, guys, is when I launch this particular VM,
a lot of things will be created it will create disks it will create networks and
everything related to it will actually go inside that particular resource group
right so I will show you how to do that just click on create once you feel
everything is right, and now my machine is basically getting created. Okay, so let's wait for a while; it will take a minute or so to deploy the machine, so let's wait for that time and hope this deploys soon. As you can see, it says your deployment is underway; once it's complete you will see a different message over here, so let's wait for that message to appear. Alright guys, so my deployment is complete, as you can see from the message shown over here.
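As a side note, the same deployment can also be scripted instead of clicked through the portal. Here is a minimal sketch using the Azure CLI; the resource group, VM name, image alias, size and key path mirror what we chose in the portal, but treat the exact values and flags as assumptions rather than the portal's output.

```bash
# Create the resource group (skip if demo-env already exists)
az group create --name demo-env --location southindia

# Create the Ubuntu VM with SSH-key authentication
az vm create \
  --resource-group demo-env \
  --name new-VM-Linux \
  --image UbuntuLTS \
  --size Standard_B1s \
  --admin-username azureuser \
  --ssh-key-values ~/.ssh/azure-key.pub

# Open the HTTP port we allowed in the portal (SSH is reachable by default)
az vm open-port --resource-group demo-env --name new-VM-Linux --port 80
```
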
So the next thing that I'm going to do is go to resource groups, and now I can see that I have something called demo-env. I'll go inside that, and over here I have all the resources related to my VM. As you can see, my virtual machine is this particular resource; let me go inside it, and now what I have to do is select this IP address, which is the IP address on which the instance is available, so let's copy this IP address and open the PuTTY tool. Now, what is the PuTTY tool? We used PuTTYgen before to create the key; now, for connecting to the instance, you will have to use the PuTTY tool. How do you download the PuTTY tool? Go back to the tab where you opened the download web page; over here you will have to select putty.exe, which it describes as the SSH and Telnet client itself. Download it according to your architecture, and then you will have a tool which looks something like
this now what I want to do is I want to connect to this IP address which I've
just copied from the Azure portal, and I will have to go inside SSH, and then inside Auth, and now I will have to select the PPK which will be used to connect to my machine, which is azure-key. Now everything is ready, guys: I have selected the PPK, I have inserted the IP address, and now let's click on open. Whenever you connect to a new IP address it will give you this warning; don't worry about it, just click on yes. Now it is asking what user you want to log in as; remember the username that we gave was azureuser, so let's type that and hit enter.
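Again, if your machine has an OpenSSH client, PuTTY isn't required; you can connect with the private key generated earlier. A minimal sketch, with the key path and IP address as placeholders:

```bash
# Connect to the VM's public IP with the private key created earlier;
# replace 13.71.x.x with the IP copied from the Azure portal.
ssh -i ~/.ssh/azure-key azureuser@13.71.x.x
```
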
And that's it, that's about it, guys: now you're inside your Linux machine, which is the Ubuntu machine, and you can do anything with it as you feel necessary. Now you might ask me one thing: why did we allow the HTTP connection? I did that just to show you one very awesome thing. Since this is Ubuntu, the commands that I'll be entering now are for Ubuntu, and if you want to copy these commands you will also have to launch an Ubuntu OS. What I'm going to do is make this server a web server, so in order to do that I'll pass in the commands: first let's update this machine with sudo apt-get update, and this will now update the machine. All right, so my machine is now updated; now let's go ahead and install Apache, so sudo apt-get install apache2. So guys, what is Apache? Apache is basically web server software, and what is happening right now is that it is being installed on my Ubuntu machine. Once it gets installed on the Ubuntu machine, all I have to do is go to the IP address of my VM instance and then I will be able to see a web page over there. It's almost complete, guys, so let's wait for this to complete and then we move forward.
All right, so Apache has now been successfully installed, and now if I go back to my portal, I just copy this IP address, paste it in a new tab, and hit enter. Can you see this? This is basically a web page which has been created by Apache2; it's the default page which you see when you have installed Apache2. But nevertheless, guys, if you go to this IP address, given that I have not shut off this machine, you will also be able to see this particular page, which is the Apache2 Ubuntu default page. Now what I can do is I can
also go ahead and edit this web page, so let's go to the location where this web page is stored: it's stored inside /var/www/html, and this is the index.html file which is being shown over here. What I'll do is just rename this index.html file to, say, 1.html, and I'll have to use sudo in order to do that; now let's create one more index.html. Let's say this is an HTML page whose title would be "demo website", and the body would have a heading which goes like "Welcome to Intellipaat's Azure training". Okay, so this is where I specified it; I closed the heading, I closed the body, and I closed the HTML as well. So I've closed the HTML, I've closed the body, I've given the h1, I've given the title as demo website; let's save it.
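In case the editor steps were hard to follow on screen, the shell equivalent of what was just done looks roughly like this; it's only a sketch, with the heading text taken from the demo and the heredoc approach being my own choice:

```bash
cd /var/www/html

# Put the stock Apache page aside under a different name
sudo mv index.html 1.html

# Create a new index.html with our own title and heading
sudo tee index.html > /dev/null <<'EOF'
<html>
  <head><title>demo website</title></head>
  <body>
    <h1>Welcome to Intellipaat's Azure training</h1>
  </body>
</html>
EOF
```
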
Now if I do an ls, I can see that there is a new index.html present, and if I do a refresh you can see I'm getting this "Welcome to Intellipaat's Azure training", and the title of my page is demo website. You would have also noticed I renamed the Apache page to 1.html, so if I go to 1.html, this is the Ubuntu default page, and if I simply go to the IP address it will open index.html, which is basically this. So guys, we have successfully deployed a Linux VM on Azure and successfully hosted a website on this particular server. Now guys, remember that this is infrastructure as a service: what I've done is, I had access to the operating system, I installed Apache2, and then I put a website in that particular folder, which is /var/www/html, and then I got the website. So this is infrastructure as a service:
you're getting the whole operating system and you can do anything with it I
can also install MySQL on it and I can configure my website to basically talk
to the MySQL server right I can do anything with this so now when you
compare it with app services which we saw earlier that app services are
basically platform as a service or function as a service functions app is
also a platform as a service. Now how is it different from the hands-on point of view? You would basically not be using PuTTY; you would just deploy the app, that is, the web app, and instead of an IP address you will get a URL. If you go to that URL you will see a default page: just like you saw a default page when we installed Apache2, you will see a default page from Azure when you go to that link, saying this is the default page for the web app that you deployed. Then, if you go to the dashboard, you will see an option for uploading, and just like I created this index.html file, you can also create yours and you just have to upload it. Azure will take care of where that file has to go inside the file system, it will take care that when you go to the link you see your website, and it will take care of what software has to be installed, whether it wants to install Apache2, which is one web server, or one of the several other web servers as well, like nginx or Tomcat. Anything that you want, or anything which is configured in Azure, would be automatically installed, and all you get is a button which asks you to upload the website. So you do not get direct access to the operating system, and hence it is called platform as a service. Similarly with Functions: in App Services you are basically deploying applications, but in a Functions app you cannot deploy an application; it is simply a place where you just put your code and that code will be run. So if, in short, I have to tell you what a Functions app does: it basically does processing. Okay guys, so this was the first demo, on how you can launch virtual machines on Azure. Now let's go ahead to our next topic.
alright guys the next in our list are the networking services of azure so
let's go ahead and start with them so guys basically there are five core
services in networking that you should know of the first one being the very
important one, which is virtual networks. Now, what are virtual networks? Virtual networks are basically isolated environments, or isolated networks, on the Azure infrastructure for whatever machines or VMs you launch in Azure, if you want them to talk to each other. For example, say I deploy an application wherein I would probably have a database, a back-end server, and a front-end server. Now all these servers have to interact with each other so that the whole application can function. How can they interact with each other? They can interact with each other only when they are on one network, right, and that is very important; otherwise they would have to interact with each other over the internet, but that is not a secure thing to do. The secure thing to do is to have instances or servers interact on their own private network, which is not accessible to the outside world; it is only accessible to the administrator who launched those instances. So in those cases we need virtual networks,
and also, one more thing: when you are using virtual networks, or private networks, in Azure, you basically get a bandwidth of around 1 GB per second; that is the kind of bandwidth you get when you are dealing with instances which are in the same virtual network, but if instances have to interact over the internet, obviously the bandwidth will go down, and at the same time it is not secure. So whenever you want to launch any resource in Azure which can be launched inside a network, it is launched inside a virtual network; you do not have the option of launching an instance or a server without a virtual network. Now, if you don't want to use it and want to have instances talk over the internet, that is your choice, but Azure does not give you the option of deploying an instance without it being part of a virtual network. Alright, so this is what a virtual network is.
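For reference, this is roughly what creating such an isolated network looks like with the Azure CLI; it's a sketch with hypothetical names and address ranges, and the exact flag names can vary slightly between CLI versions:

```bash
# Create a virtual network with one subnet inside the demo resource group
az network vnet create \
  --resource-group demo-env \
  --name demo-vnet \
  --address-prefixes 10.0.0.0/16 \
  --subnet-name default \
  --subnet-prefixes 10.0.0.0/24
```
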
Moving forward, now let's talk about load balancers. What are load balancers? This is a very important service, the reason being that when you deploy your application on cloud, one of the most popular reasons for deploying applications on Azure is that you can get high availability, which basically means you can launch your application on multiple servers, so that even if one server fails the other server can act as its replica and serve the application; this results in high availability of your application. Now, what is a load balancer? A load balancer basically sits in front of multiple virtual machines which have the same application running on them. Obviously, if you are a customer, you do not realize whether an application has been made redundant on multiple servers: if you go to facebook.com, or if you go to intellipaat.com, you wouldn't realize how many servers are actually working in the backend. So how is it possible? Servers have IP addresses, so in principle you as a customer would need to know which IP address to hit, but all you do is go to a particular domain, let's say intellipaat.com, and you get the website. So how is it actually working? The way it works is that the load balancer has its own IP address or domain name, and what the load balancer does is spread the requests out across the N number of servers it has. Let's say my Intellipaat website resides on five servers, wherein one is the primary server and four are the redundant servers, but all these five servers are serving my website. Now when a customer comes to intellipaat.com, his request is processed in this fashion: whenever he comes to intellipaat.com, he is routed to the load balancer's domain name or IP address, and once the request reaches the load balancer, the load balancer sees which server has less load and sends the request to that server. All right,
now, the load balancer also ties into an important concept: you must have heard about auto scaling, or, if we talk about it in Azure terms, it's basically called VM scale sets. What happens in that scenario is we specify a threshold limit; we specify that whenever the CPU usage goes beyond 80% across the aggregate of the servers, let's say there are four servers and the average CPU load on all these four servers has gone beyond 80%, it will launch a new instance. Now, when it launches a new server with the same application, your load balancer should also route traffic to it, and that is where the load balancer plays a key role: it plays a key role when you are doing auto scaling, and at the same time it plays a key role when you have multiple servers which are serving the same application and you want to divide the traffic coming onto your servers equally. So this is what a load balancer is, as the name suggests. But the load balancer, in its most basic sense, does this randomly: it randomly but equally distributes the traffic among all the servers, and it does not follow any protocol or rule as to how it has to divide the traffic. This kind of procedure, when it is followed, is basically called a round-robin fashion of distributing traffic.
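As a rough idea of how this looks outside the portal, a basic load balancer with a frontend and a backend pool can be created with the Azure CLI; this is a sketch with hypothetical names, and the backend VMs would still have to be added to the pool separately:

```bash
# Create a basic load balancer with one frontend IP and one backend pool
az network lb create \
  --resource-group demo-env \
  --name demo-lb \
  --sku Basic \
  --frontend-ip-name demo-frontend \
  --backend-pool-name demo-backend

# Round-robin style rule: forward incoming port 80 to port 80 on the pool
az network lb rule create \
  --resource-group demo-env \
  --lb-name demo-lb \
  --name http-rule \
  --protocol Tcp \
  --frontend-port 80 \
  --backend-port 80 \
  --frontend-ip-name demo-frontend \
  --backend-pool-name demo-backend
```
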
The next kind of load balancer is the application gateway load balancer. Now, what is the application gateway load balancer? The application gateway load balancer works a little differently from the normal load balancer that we discussed earlier: in the application gateway load balancer, the load is distributed based on rules. What are those rules? Those rules are basically paths. For example, if I go to intellipaat.com/blog, what will happen? I get to see the blog site, and if I go to intellipaat.com/all-courses, I see all the courses which are there on the website. Now how does that work? If you are from an IT background and understand how servers work, you might say that there must be two folders inside the root directory of the web server, wherein the all-courses folder has the code for displaying all the courses and the blog folder has the code for showing all the blogs, but that is not actually how it works. If I were to talk about our infrastructure, the Intellipaat infrastructure, we have a separate server for blogs and a separate server for all the courses, and the load balancer that we use is an application gateway load balancer; it works kind of like that. Basically, whenever it sees that the URL has a path which says /blog, it routes the traffic to the blog server, and whenever it sees that the path is /all-courses, it routes the traffic to my courses server. So this is what path-based routing is, and it works on layer seven; so if you
guys are aware of the OSI model right so according to the OSI model the
application gateway load balancer works on layer seven. Now if you are not from a CS background and you didn't understand what I just said, it is okay; in the most basic sense, what the "layer seven" that it works on means is that the application gateway load balancer does path-based routing: whatever path is there in the request, based on that path it routes the request to a particular server. You can also define a default rule: in case anything other than /blog and /all-courses comes in, if there is a default server that you want to send the request to, that is also possible.
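A very compressed sketch of what such path rules look like in the Azure CLI follows; this assumes an application gateway named demo-appgw and backend pools for the blog and courses servers already exist, and the command parameters here are my assumptions rather than anything shown in the session:

```bash
# Route /blog/* to the blog pool; anything else falls through to the default pool
az network application-gateway url-path-map create \
  --resource-group demo-env \
  --gateway-name demo-appgw \
  --name path-map \
  --rule-name blog-rule \
  --paths "/blog/*" \
  --address-pool blog-pool \
  --default-address-pool default-pool \
  --http-settings appGatewayBackendHttpSettings \
  --default-http-settings appGatewayBackendHttpSettings

# Add a second rule that sends /all-courses/* to the courses pool
az network application-gateway url-path-map rule create \
  --resource-group demo-env \
  --gateway-name demo-appgw \
  --path-map-name path-map \
  --name courses-rule \
  --paths "/all-courses/*" \
  --address-pool courses-pool
```
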
like for example you would have seen that some websites if the link is not
found on that website, they basically route you to a 404 page. Now, how does the typical 404 page look? If you look at a plain 404 page, it says something like "error 404, content not found", but some websites, like flipkart.com or amazon.com or even intellipaat.com, handle it differently. If you try to go to something which is not there, for example, let me just switch to my browser and go to, let's say, intellipaat.com/azure, now what will happen? It'll give me a 404 page, and let's see how that looks. Okay, so there is actually an /azure page; let me do one thing, let me type in some gibberish content and search for that, so this should give me a
404 all right so you can see I've got an oops page which says we were unable to
find the page you are looking for, and if I change the content over here as well, let's say we enter some random text again, I will again see this page; can you see, it says page not found. Similarly, the way it happens for other websites: for example, if I go to amazon.com and type gibberish, let's see what happens; it says 404 document not found, and this is basically a custom page, and if I go to any other gibberish URL I again get this page. Now, how is that working? That is working because they are also using an application load balancer which basically says: for anything other than the rules that we have specified, if someone tries to go to a URL outside the ones we have defined, route them to the 404 page, which is this, and that's how the application load balancer is bringing you to this particular 404 page. So guys, this is the application gateway. The next service that we have in networking is called DNS zones. Now, what is a DNS zone? Any
website that you go to guys you do not enter the IP address for that website
right you basically enter a domain name and that domain name basically gets you
to the website; you never enter the IP address. Similar is the case with DNS zones: what DNS zones helps you with is routing your domain to the Azure resource where your application resides. For example, you go to a domain registrar and buy a domain, let's say you buy personalwebsite.xyz, and now, for whoever is going to this domain, you want to route their traffic to one of the virtual machines that you have launched in the Azure portal. How would you do that? For doing that, you will have to go inside the DNS zones service of Azure, and you will get some name servers; those name servers are DNS servers that Azure owns, and those DNS servers you will have to specify for your domain. So wherever you bought your domain, you have a dashboard wherein you can put these DNS servers, which the domain that you have bought will then point to; for example, whoever goes to personalwebsite.xyz would be routed to those DNS servers that you mention over there, and those DNS servers you get from DNS zones. The next thing you do is specify in the DNS zone that whatever traffic is coming for the domain you have specified should be routed to this particular VM instance; you have to specify that in the DNS zone service, and that is how it will work. So whenever you have a use case where you have a domain with you, or even if you want to buy a domain, you can actually buy it from the Azure dashboard as well, and you want to point it to any Azure resource which exists in your dashboard, the way to do that is by going inside DNS zones.
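A compact sketch of those two steps in the Azure CLI; the domain name and the IP address here are purely illustrative:

```bash
# Create the DNS zone; Azure then shows the name servers to put
# into your registrar's dashboard.
az network dns zone create \
  --resource-group demo-env \
  --name personalwebsite.xyz

# Add an A record that points the domain at the VM's public IP
az network dns record-set a add-record \
  --resource-group demo-env \
  --zone-name personalwebsite.xyz \
  --record-set-name "@" \
  --ipv4-address 13.71.x.x
```
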
The next service, guys, is again a very important one, which is called CDN. Now, what is a CDN? CDN basically means content delivery network. How does that work, and what is a content delivery network? A content delivery network basically improves the time taken to serve you a website. For example, if you go to amazon.com or facebook.com, do you happen to notice the speed with which the website loads? If you have a good internet connection, it loads very fast. Now where do you think the Facebook website is actually hosted? Facebook is a multinational corporation; it's a big website which is actively being used across the whole world, so they have different data centers across the globe from where the requests are served. But let's talk about a smaller-scale website, let's talk about intellipaat.com. We don't serve the whole world; we
might have around 400 to 500 people at a particular time on our website right now
what happens is we get traffic from across the globe right we want everyone
to have a speedy experience when it comes to
them using our website. Now, what are the ways that I can ensure that? Let's say my traffic is coming from the US, and the traffic that comes from the US is, let's say, 60% of my overall traffic; so what I can do is set up my data center in the US so that the US people will actually get the website
fast. But let's say someone is accessing the website from Japan: in that case, whenever he goes to intellipaat.com, the request will be sent to the US servers, the response will come back from the US servers, and that's when his website will start to load. This increases latency; latency is basically the response time of a particular application, in our case the website intellipaat.com. Now, if you want to reduce latency, what are the different ways to do that? The first way is that you can open a data center in Japan as well, scale up the number of servers that you have, put the website over there too, and whenever people from Japan access your website they would be seeing the Japanese version of the website, the people in the US would be seeing the US version, and the people in India would be using the Indian version. That's how it works when you use an e-commerce website like Amazon: you have amazon.in, you have amazon.com, and for Australian people it's amazon.com.au; for different countries it has a different extension, which basically means that the website is being served from their home country's data center. But there is another way to serve your website faster, and that way is called CDN. What is CDN? CDN basically caches all your static data. For example, what kind of static data are we talking about? Let's talk about videos: let's say my Intellipaat website also plays videos in the self-paced courses. Now, if you are a person who is in Japan, and let's say my servers are there in the US, what will happen is, the moment you try to play a video on my website, that video will get downloaded to the Azure data center location nearest to you. So while you were watching that video, that video got downloaded onto the Azure data center in Japan, and what happens later is that whenever any other person in Japan tries to access that website, he will get access to it through the Japan server rather than the US server. So what you have done is: you have set up your servers in the US, you have not set up servers in Japan, but you have enabled CDN on your website,
and what happens in that case is all the static content is basically loaded on to
the edge servers of Azure; whatever servers are used to cache static data in the CDN service are called edge servers. So whoever is accessing my website, from whatever part of the world, if they have an Azure data center near them, my content will automatically get downloaded over there, and whenever there is a next person trying to access it, that already-downloaded data will be served to him directly. That's how I reduce the latency of my application without setting up servers in different locations for the different kinds of traffic that I get. So this is how it drastically reduces costs, because I'm not launching new servers, and at the same time it serves the purpose of reducing the latency of your application. So this is what CDN Profiles is, guys.
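Enabling this in Azure boils down to a CDN profile plus an endpoint pointing at your origin server; here is a minimal CLI sketch with hypothetical names, where the SKU and origin host are assumptions and not something from the session:

```bash
# Create a CDN profile, then an endpoint whose origin is our web server
az cdn profile create \
  --resource-group demo-env \
  --name demo-cdn-profile \
  --sku Standard_Microsoft

az cdn endpoint create \
  --resource-group demo-env \
  --profile-name demo-cdn-profile \
  --name demo-cdn-endpoint \
  --origin www.example.com
```
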
So these were the core services in networking of Azure. Now let's move forward and talk about the next set of services, which are basically the storage services. All right, so guys, next on our list we have the storage services of Azure, so let's go ahead and have a look at them. Essentially there are six services that you should know of while you are using Azure in terms of storage: the six services are blob, file storage, tables, queues, data lake storage, and data box. So let's discuss them one by one, the first one being blob. Now, what is blob, guys? Blob basically means binary large objects; so if you have binary files and you want to store them on Azure, blob storage is the answer for you. It can store anything: music files, video files, text documents, any kind of file that you want to store on Azure can be stored in blob. Blob can also be used in conjunction with your websites and can act as a storage server for you; it also enables you to host content which is publicly accessible over a link, and it can even host static websites on its storage. That is what blob is, guys; most of the time you would see that if companies have a website, all its static content is actually picked up from blob. So this was about blob,
guys our next service is file storage now what is file storage file storage is
basically a shared file storage that can be used with multiple computers that
basically means if I have five servers all right and let's say they need a
particular file for their working and all these five servers basically need
the same file. So what I can do is create a drive, or a storage point, on file storage, and all these five servers can mount that file storage drive on them; any change that is done from one server on the central repository of file storage will be reflected on all the other servers.
the use that I can think of for file storage is let's say you have an
application which writes data onto a particular file right and basically this
application of yours is spread across multiple servers to ensure it is highly
available right let's say you have an application which basically writes data
right and this application is spread across
servers. Now, we've already discussed the role of a load balancer: you hit the load balancer, you're randomly assigned a server, and you work on it. Let's say you work on server one; as a customer you would not know which server you are working on, correct? Say you worked on server one, you saved your changes, and those files are now saved on server one, but the next time you hit the URL you are on server three. If you are on server three you would not realize it, but you would expect your work to be there on server three as well. Now here comes the problem: if there is no central storage for all these file servers, they will be asynchronous in nature, in the sense that they will not sync data with each other; whatever changes are done on the first server, the third, fourth, and fifth servers will be unaware of, and that is where the need for a central storage arises, and that is what file storage is for. It basically uses the SMB protocol; SMB 2.1 and 3.0 are compatible with file storage, and recently even Linux platforms like Ubuntu and CentOS, and even macOS, have started supporting SMB. Azure essentially recommends you use SMB 3.0, but there are some OS versions which only have SMB 2.1; for example, Ubuntu 16.04 or 14.04 use SMB 2.1, so it has backward compatibility as well for older OSes like Ubuntu 14.04. So if you
want a drive on Ubuntu which is hosted on Azure File Storage, you can do that using the SMB protocol; there is a list of software that has to be installed, and more on that we can discuss when we study file storage in detail. For now, because this is a tutorial session, I have just told you what it is useful for; so if you want to think of a use case where you need a central-storage kind of thing, now you know that you have to use Azure File Storage.
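For the curious, mounting an Azure file share on Ubuntu roughly looks like this; it's a sketch assuming a share named demo-share under the storage account, with the account name, key, and mount point as placeholders:

```bash
# The SMB/CIFS client utilities are the "software to be installed"
sudo apt-get install -y cifs-utils

# Mount the Azure file share over SMB 3.0
sudo mkdir -p /mnt/demo-share
sudo mount -t cifs //<storage-account>.file.core.windows.net/demo-share /mnt/demo-share \
  -o vers=3.0,username=<storage-account>,password=<storage-account-key>,serverino
```
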
Moving on, guys, our next service is Azure Table. Now, Azure Table is basically a NoSQL data store, and it can help you store structured data: it has tabular columns and rows in which you can save data, but it is NoSQL in nature, which means the data does not have to be symmetric; one row could have eight columns, the third row could have two columns, the fourth row could have just five columns. It can take any kind of data, but the only condition is that the data should be structured among columns. So if there is any kind of need to store data of this kind, you can use Azure Table. The next storage service is Azure Queues, which is basically used with stateless systems, that is, systems which do not know what jobs are executing on their counterparts, or on the replicated
servers of their application for example let's say there is an image processing
website and what it does is the moment you upload an image you can process the
website or sorry process the image according to your need right so what
happens in this case let's say there are 200 people on the website and they have
all pressed the process image button together and what will happen there are
200 images that have to be processed and obviously not all 200 images can be
processed at the same time so what happens is the images are processed one
by one and let's say the in the backend server where the processing is done
there are five servers who are doing the processing okay now all these five
servers they pick up your image at random from the queue so let's say the
queue is first in, first out: whichever image comes in first goes out first as well. So whenever the job of the first server is done, let's say it was processing an image and the image processing is done, it picks up another image from the queue; the second server will do the same, and the third server will do the same. Now, say the first server processed image one: without a queue, the second, third, fourth, and fifth servers would not know whether the first image has been processed or not, and they could process it again. For those kinds of scenarios, and for solving those kinds of situations, we have queues, which streamline all the content which has to be processed; whichever item is taken up by a particular server is removed from the queue, and then the next image, or the next object, which has to be processed is handed out. So for these kinds of scenarios you use Azure Queues.
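As an illustration, putting work items onto such a queue with the Azure CLI looks roughly like this; the account name, queue name, and message body are placeholders, and the CLI would also need the account key or a logged-in identity:

```bash
# Create a queue in the storage account, then enqueue a work item
az storage queue create \
  --account-name intellipaatazure \
  --name image-jobs

az storage message put \
  --account-name intellipaatazure \
  --queue-name image-jobs \
  --content "process-image: photo-123.png"
```
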
The next service is Data Lake Storage. Now, Data Lake Storage is similar to blob, or rather to tables, Azure Tables, in that it can store data, but it is used to store data for big data analytics; it specializes in that particular segment. So if you have a big data analytics use case and you want to store data for it, Data Lake Storage should be the choice for you. So this is about Data Lake Storage, guys. The next service that we have is Azure Data Box. Now, what is Azure Data Box? Think of your company: let's say you are working for a company and you have all the servers on premise; you have been working for 15 to 20 years in that sector, you did not have cloud at that time, and what you have done is bought your own servers and hosted your own applications. But
now the cost incurred for those servers is too high when you compare it with the
cost which is there in cloud computing when you opt for cloud computing so now
what you have decided is that you want to migrate all your applications to Azure. So let's say there are five petabytes, or let's just say a thousand terabytes, of data that you have to transfer to Azure. Now, in business, time is very important, guys; time is money, you must have heard that phrase, so everything is required at a fast pace, and if you want to transfer a thousand terabytes of data, imagine the data transfer charges that would be incurred, and also it would take a lot of time to upload a thousand terabytes of data over the internet to Azure; you would agree on that.
now to solve problems of this scale when you have a large amount of data that you
want to transfer onto the Azure cloud, what Azure does is give you a physical device onto which you can load your data. So Azure Data Box is a service in which you can request a box kind of a system from Azure, which is shipped to your company; you load all your data onto it, and then that box is shipped back to Azure, and all your data is uploaded to the Azure cloud from their data center directly. This not only reduces the time taken to transfer data, but it also hugely reduces the cost that you would incur in internet charges, because a thousand terabytes is not a small amount of data, guys. Actually, a thousand terabytes is quite little when I talk about a company which has existed for fifteen years; when we talk about multinational corporations, the data can be in petabytes, and if you have petabyte-scale data that you want to transfer to Azure, it would take weeks if not months to transfer that kind of data. So that is where Data Box comes in: Azure Data Box can support around five petabytes of data at one time, and if five petabytes of data is less for you, you can request multiple boxes from Azure; multiple boxes will reach your location, you copy your data onto them, and then you request Azure to come and take the data boxes away; they will take the data boxes back to the data centers and transfer the data directly. This will reduce your time to around one week, so your data of petabyte scale can be transferred to Azure in less than a week, and at the same time you also save on huge internet costs. So this is what the Data Box service of Azure is. Alright guys, so now let's go and do a hands-on on Azure blob storage, so let me quickly switch to my Microsoft Azure portal, which is this. Alright guys, so this is my Microsoft Azure portal; what I want to do is show you guys how blob storage works in Azure. So the first thing that I'm going to do is go
into storage accounts and over here you will find that there is a storage
account related to the demo environment. Why is it here? Because I created the virtual machine in the demo environment resource group, and obviously the virtual machine requires some storage. If you go inside this, you will see that you have the four options that we discussed: blob, files, tables, and queues. What we are interested in is blob, so we'll just click on that, and you can see that this has boot diagnostics for the new-VM-Linux that we launched earlier; all the logs for that VM are going inside this container. If I go inside this container, you can find that these are the two text files which are present and which have the log data of that VM that we launched. But we are not here for that; what I want to do is show you how you can use blob. Now, no matter what you are working on, if you want to create blob storage for yourself you will have to create a storage account. In our case this was the VM's storage account; if you are starting afresh on Azure you will not have any storage accounts over here, so let's create a storage account first, and let's try to name it something relevant which goes with the demo that we are doing. So the resource group that we want to add it in is demo environment, and the storage account name that we want to give is, let's say, intellipaat-azure; let's say this is the one that we want to give, okay. What is the location that we want this storage account to be in? As discussed earlier, we'll choose the nearest location to us, so in that case the location would be Asia Pacific, South India; let's choose this location. Now, what do you want the performance tier to be? It could be either standard or premium; we can discuss this later, probably when we discuss this service in detail in upcoming sessions, but for now let's let it be standard. Keep everything else at its default and now let's click on review plus create; when you do that, it will show you all the information with respect to the storage account that we are creating. If everything looks good, just click on create. Once you have clicked on create, guys, your storage account will take some time and then it'll become ready for use, and like I specified earlier, all you have to do is go to resource groups; let's say this is the demo environment that we
created earlier I'll go inside this and now I can see all the things that I had
launched present over here; even the storage account that I'm creating, which is being launched right now, I can see over here as well. As you can see, it's intellipaat-azure, the storage account that I created, and now it is visible inside the demo environment resource group. On the other hand, if I go to all resources, I'd be able to see all the resources that are there on my Azure account, but this looks a little messy, and that's why resource groups are so useful: I know that the resource I've deployed sits inside a specific resource group, and that resource group I can name on the basis of what task those particular resources are going to do. Since ours is a demo session, what we have done is put all the resources that we are deploying inside the demo environment resource group, and right now we just deployed this storage account, which is intellipaat-azure, and as you can see it has been successfully deployed. Now let's go inside the storage account; like we saw earlier, you see a similar user interface over here, with four options: either go inside blob, or files, or tables, or queues. Let me demonstrate to you how blobs work, so we'll go inside blobs, guys, and as you can see there are no containers over here. What are containers? Containers are nothing but
root folders right for example on Windows you have the desktop folder so
the desktop folder becomes the root folder; or, to give you a better example, let's say I go to the C drive of your computer: whatever is inside the C drive is the root of the drive, and if you create a folder inside that drive and then put files inside that folder, that is no longer the root directory. So whenever you are uploading files to blobs, the first thing that you need is a root directory, and that root directory is nothing but a container. So let's go ahead and create a container: I'll click on new container and let's name this container, let's say, intellipaat, okay, and now
let's click on OK alright so my new container is now ready now you would
have noticed that there is something called as access level and right now the
access level is private more on this I'll explain to you as we move along so
right now we have not changed it to anything; it's private, and let it be private. Now what I'll do is go inside this container, and it says no blobs found. So what I can do is upload some files onto this container, and for doing that I'll have to click on upload and then select the file. So these are some images that I have on this system,
so what I'll do is I'll just select a random image and now let's upload this
image onto my blob. Alright, so this image is now uploaded on the blob, and what I can do is just click on this image and see all the information with respect to it: it's 88 KB in size, this is the last modified time, this is the creation time. Now here is something interesting, guys: I have the URL to the file that I uploaded. This URL can be embedded anywhere; it can be embedded on a website, it can be embedded in a post that you are making on social media. This is basically a file which is hosted on the internet, and anyone anywhere who has access to this particular link would be able to see the image that you have just uploaded. Awesome, isn't it? So what I'm going to do is copy this link, open it in a new tab, and hit enter. Right now it says "resource not found, the specified resource does not exist"; now why do you think this happened? This happened because we did not give access to the container for general public use. What does that mean? If I click on the container and click on change access level, it's right now private, which means I can only access this blob through the Azure portal, or through the CLI, or through some other authenticated way; only those ways would work. Now if I want anyone, anywhere, to be able to access this file, I'll have to go ahead and choose anonymous read access for blobs only. So when I do that, I just select "blob" over here and click on OK, the permissions for the files inside this container have changed, and what does that mean? That basically means if I refresh this now,
I would be able to see this image over here. Similarly, if this link is copied by you and pasted in a browser, you also would be able to see this particular image. So guys, this is what blob storage is: you can use blob storage to dump data such as logs, images, and videos. That data could be private, if your application does not need it to be public and you just want to keep it there, or you can give it public read access at the blob level, that is, all the files will have public read access and those files can then be used by just going to their link. Let me upload one more file, guys, so that you are clear with the concept of defining permissions. I have already defined the permission on the container, so I don't have to do anything now; I'll just upload the image, and this is a new image which has been uploaded. Now if I go inside the image, copy the link, and paste the link over here, you can see I am able to see the screenshot even here. The moment I go out, change the access level to private, and click on OK, now if I refresh: okay, so it has not changed the permissions yet, the file is still visible; it basically takes some time for the permission to get applied on the container, which means if I go out and change the access level back to blob and refresh the files, yes, I am able to see them. And again, if I go out and change the access level to, let's say, private again and refresh: the first refresh it is there, the second refresh it is there, but if you wait for some time, like a minute or so, this access will be gone; as you can see, it now says resource not found. So with the click of a button you can control the permissions of all the files which are there inside a container and grant them public or private access.
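The same container, upload, and access-level steps can also be done from the Azure CLI; this is only a sketch, with the storage account and file names as placeholders, and the CLI will additionally need the account key or a logged-in identity:

```bash
# Create a private container in the storage account
az storage container create \
  --account-name intellipaatazure \
  --name intellipaat

# Upload an image into the container
az storage blob upload \
  --account-name intellipaatazure \
  --container-name intellipaat \
  --name image1.png \
  --file ./image1.png

# Flip the container to anonymous read access for blobs, then back to private
az storage container set-permission \
  --account-name intellipaatazure \
  --name intellipaat \
  --public-access blob

az storage container set-permission \
  --account-name intellipaatazure \
  --name intellipaat \
  --public-access off
```
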
So guys, this was a brief demo of blob storage on Azure; let us come back to our slides. Guys, I hope you are now clear on what blob storage is. Now let's move forward and talk about our next set of services, which are the database and analytics services in Microsoft Azure. Basically, there are five core services that you should know of when you are dealing with Azure in terms of databases and analytics. The first service is the SQL Database service, which is basically a database as a service: you do not get access to the operating system on which this database is installed, you only get access to the database. And if you remember what we studied earlier, a service which does not give you access to the operating system, but in turn gives you access to a platform kind of thing where you can interact with the service and probably upload something onto the software being used, is called
platform as a service. So Azure SQL Database is basically a platform as a service which not only gives you independence from the infrastructure side, but at the same time is highly scalable, and it can provide you up to 212 percent return on investment; that means that whatever money you are going to spend on Azure SQL Database, the benefits it provides are claimed to give you back two hundred and twelve percent of the money you would be investing. Isn't that interesting?
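For a feel of how little infrastructure you touch, this is roughly all it takes to get a database with the Azure CLI; the server name, credentials, and tier here are hypothetical placeholders, not values from the session:

```bash
# Create a logical SQL server, then a database on it
az sql server create \
  --resource-group demo-env \
  --name demo-sql-server \
  --location southindia \
  --admin-user sqladminuser \
  --admin-password '<a-strong-password>'

az sql db create \
  --resource-group demo-env \
  --server demo-sql-server \
  --name demo-db \
  --service-objective S0
```
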
Right, so this was the first service among the database and analytics services in Azure. The next service is Cosmos DB. Now, Cosmos DB is basically a fully managed database service, again just like SQL Database, but in this case your database is extremely highly available, which means it is distributed throughout the world using Azure regions. Basically, when you launch a Cosmos DB cluster, it creates a replica of the database in multiple regions, as you specify, and the
cool thing about cosmos DB is whatever region you want to close or want to stop
and probably you do not want your database to be replicated in a
particular region you can do that with the click of a button if you want it
again in some other region, you can again do that with a click of a button. So everything is fast, the whole control is at one central place, and you can replicate your database accordingly from that one platform, that one Cosmos DB dashboard that you get. Now, again,
you have multiple models that you can implement in cosmos DB for example you
can implement multi master replication or you can implement multi write regions
replication, which basically means: usually when you have a distributed database which is highly available, you have one region where all the writes take place, but reads can actually be done from any of the regions where the database has been distributed. In the case of Cosmos DB, however, you also have the option of configuring multi-write regions, which basically means that, let's say I am in India and I'm a user who's using your application: if I am using your application, the website is probably being served from the edge location nearest to my location. For example, you must have enabled a content delivery network on your website to decrease the latency of your application; so, since you have implemented CDN, I am being served the website from the location nearest to where I am accessing your application from. At the same time, the application itself will also be accessing the database. Now imagine the website is fast, but the processing that it does, that is, the data that it has to fetch from a particular database, exists in an altogether different region, which is on the other side of the world; that'll again increase the latency of your application. So you don't only
need your front-end component to be highly distributed you also need your
back-end and your database to be highly distributed and highly available right
and that is the sole reason when you are dealing with the audience which is
spread across the globe, you also have to take care that the processing each of these users is doing is available to them in the lowest time possible, and for that, the back-end servers and the database servers also have to be near them, so that whenever they are writing any information, that information can also be written faster. Now, if you have a write-in-one-place, read-anywhere kind of distribution, you'll have to write at a central location, let's say the central location is the US, and your databases, which are the read replicas, are distributed throughout the globe. What are read replicas? Read replicas are basically copies of the central database from which you can read the data, and these read replicas are kept synced with the central database, so whatever you write on the database is automatically and quickly replicated to the read replicas as well; it's very fast. So in terms of reading the data you can get very low latency, but in terms of writing the data, if you want low latency as well, then you need to have multi-write regions, and that is exactly what this feature is for. In a nutshell, Cosmos DB is basically a fully managed database service which provides you with a large-scale deployment of databases throughout the globe, and it also gives you the option of removing or replicating your database in a new region entirely with a single click of a button. So that is what Cosmos DB is for, guys.
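To make the region control concrete, this is roughly how a globally distributed, multi-write Cosmos DB account is created with the Azure CLI; the account name and the region pair are hypothetical, and the exact flags are assumptions on my part:

```bash
# Create a Cosmos DB account replicated across two regions,
# with writes enabled in every region (multi-write)
az cosmosdb create \
  --resource-group demo-env \
  --name demo-cosmos-account \
  --locations regionName=southindia failoverPriority=0 isZoneRedundant=False \
  --locations regionName=eastus failoverPriority=1 isZoneRedundant=False \
  --enable-multiple-write-locations true
```
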
The next service that we have is the Data Factory service. Now, what is the Data Factory service? It is basically an ETL service, that is, an extract, transform, and load service, which can take data from multiple sources, then transform the data according to the logic that has been coded, using a compute service that works in conjunction with Data Factory, and then load the resultant data into a BI service which can be used to analyze the data. Usually, when you are doing
this kind of an infra setup you need a lot of planning you need a lot of
mediators between the technologies which are being integrated for example the
data can be coming from multiple sources such as social media websites it can be
coming from emails it can be coming from chats it can be coming from reviews it
could be coming from anything right so all these data sources are basically
first of all aggregated and the data is first put in the raw form in one
particular place then what happens is according to the logic that you have
specified for the transformation of this data for example you probably want to
see the data in a particular view so that basically means you would have to
first transform this raw data that you have got from multiple sources into a
structured format so that is what transformation is so once you have the
resultant data that you feel is the correct way of getting the data or
representing the data the next step that you need is obviously if there are a
million rows in your data set you won't be able to find what you are actually
looking for or basically it might take a lot of time for you to analyze a million
rows of data that you have just created so what you can do is or what what the
industry came up to this particular problem was that they created a business
intelligence application. What does the business intelligence application do? It takes in the data set and quickly creates patterns and graphs which can actually help you to better understand the data.
so data factory also happens to have the integration capabilities with a lot of
BI tools the most prominent one is Microsoft's own power bi tool but it
can also integrate with Tableau and other BI systems. Similarly, it can be integrated with multiple data ingestion systems as well, so it's a complete ETL system that you can deploy on Azure: connect your various sources and outputs and get the resultant output. Okay, so this is what Data Factory is. Our next service is Event Hubs. Now, what is Event Hubs? Event Hubs is basically a place where you can again take in a lot of data; just like Data Factory you can deal with data from multiple sources, but in this case you are not actually extracting the data; rather, the multiple sources where the data is being generated push the data onto Event Hubs. So the data is pushed onto Event Hubs, and Event Hubs' job is to process this data and see where this data has to go next
right for example let's say the data is coming from a social media website right
so you analyze okay so this data came from a social media website it has to go
to this particular service this data is coming from this particular website it
has to go to X or Y particular service similarly you have millions and millions
of data coming every second or data packets coming every second and the sole
job of event hub is to basically analyze each packet and directed to the
corresponding consumer of that packet right so this is what event UPS is for
you so this is done very fast so the speeds claimed by Azure is around 1 Gbps
that that is it can process 1 GB of data per second so the message is coming to
end event ups can be processed that faster and it also does parallel
computing of each data packet right so this is how or this is whatever event
hubs is for guys now obviously event ups cannot be just used alone right you have
to use it with in conjunction with a host of applications for example the
producers that is the data producing resources have to first combined with
innovator and then you ever define the rules in event hub as to what data goes
where our what data goes to which consumer right and then your consumers
are defined on an basically connected to event hubs right
so basically event hubs in the most basic sense is a single point of data
ingestion so you don't have to worry about where should I send my data send
all your data to event hubs and event up will decide where your data packet has
to go based on what rule you are defined inside that system okay so guys this is
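As a small illustration of the producer side, here is a sketch using the azure-eventhub Python package to push events to a hub. The connection string and hub name are placeholders, and the routing to consumers would be handled by whatever consumer groups and downstream logic you define.

```python
from azure.eventhub import EventHubProducerClient, EventData

# Placeholder connection string and event hub name.
CONN_STR = "Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=<name>;SharedAccessKey=<key>"
producer = EventHubProducerClient.from_connection_string(CONN_STR, eventhub_name="clickstream")

with producer:
    batch = producer.create_batch()
    # Each event is just a payload; consumers decide what to do with it.
    batch.add(EventData('{"source": "social", "user": "u42", "action": "like"}'))
    batch.add(EventData('{"source": "web", "user": "u7", "action": "page_view"}'))
    producer.send_batch(batch)
```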
So guys, that is what Event Hubs is for you. Our next service is Data Lake Analytics. We already saw what Data Lake is: Data Lake is basically a storage for big data analytics. Data Lake Analytics, on the other hand, is a distributed, cloud-based data processing infrastructure; it is architected to perform data processing on big data. I mean, the data stored in the Data Lake has to be processed by something, and that processing can be done by Data Lake Analytics. Its architecture is based on YARN, which is a component of the Hadoop ecosystem, and it pairs with your Data Lake Store, where you have stored all your data, to perform analysis at big data scale. Remember, big data means a huge amount of data with a lot of variety in it; the data is mixed and matched and in the rawest form possible, and processing it can only be done with selected tools built for the purpose. If you use simple or traditional tools, the time they would take to process that amount of data would be huge, and that is why you need Hadoop-style tools to make the process faster. Azure has used the same technology that Hadoop uses and made it even faster for you. So Data Lake Analytics is used in big data analytics; it is basically used for processing data, and which data? The data you store on your Azure Data Lake Store. That is what Data Lake Analytics is, guys. Moving forward, let's move on to the next domain, which is the AI and machine learning domain in Azure. Alright guys, so in Azure you basically have three core services in the AI and machine learning domain: Cognitive Services, the Bot Service, and the awesome Machine Learning Studio. Let's understand each of these services one by one.
The first service is Cognitive Services. What are Cognitive Services? Cognitive Services in Azure are basically APIs and SDKs that have been developed so that a developer can integrate them into his application, and these APIs and SDKs interact with machine learning models which have already been created and trained in Azure. Cognitive Services include services like Vision, which gives you image-processing capabilities, and Text Analytics, which can do natural language processing for you. All these services in Azure are ready to use; to use them you integrate them with your applications, and that integration can happen with the help of the APIs or the SDKs. Once your application is integrated with these services, you are charged on the basis of the number of requests you make to the particular service. For example, the Vision service in Cognitive Services is pretty awesome: for any image you upload, it will tell you everything that is present in the image. How is that possible? Azure has already processed thousands, millions, even billions of images, and it has trained its image-processing machine learning model in a way that it now gives accurate results. So instead of creating your own machine learning model for image processing, you can use Azure's model and get your work done. It might have used deep learning, it might have used plain machine learning, we do not know, but the results are very accurate. So if you want to use the AI and machine learning capabilities of Azure, that is, the models trained by Azure, you can use Cognitive Services directly, and to integrate them with your applications you have several APIs and SDKs available to you. So that was Cognitive Services, guys.
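For a feel of what that integration looks like, here is a hedged sketch that calls the Computer Vision analyze endpoint over plain REST with the requests library. The endpoint URL, API version and key are placeholders, so check the current Cognitive Services docs for the exact path your resource exposes.

```python
import requests

# Placeholders: your Cognitive Services resource endpoint and key.
ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"
KEY = "<subscription-key>"

def describe_image(image_url):
    """Ask the Vision service what it sees in a publicly reachable image."""
    resp = requests.post(
        f"{ENDPOINT}/vision/v3.2/analyze",          # API version may differ on your resource
        params={"visualFeatures": "Description,Tags"},
        headers={"Ocp-Apim-Subscription-Key": KEY},
        json={"url": image_url},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

result = describe_image("https://example.com/cat.jpg")   # placeholder image URL
print(result["description"]["captions"])                 # caption(s) plus confidence scores
print([t["name"] for t in result["tags"]])                # tags detected in the image
```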
Now, there is a derivative of Cognitive Services that Azure has launched, which is called the Bot Service. What is the Bot Service, guys? The Bot Service is basically a chatbot offering from Azure which is totally based on AI, and it makes use of the natural language processing capabilities that are there in Azure Cognitive Services. It is very much to the point: you don't have to train the model from scratch to make it an expert in the conversation you are having; it can actually learn from every conversation it has, and you can tweak its settings so that it becomes custom-designed for you, so that your customers get the right answer every time they ask the bot a question. In a nutshell, the Azure Bot Service is nothing but a pre-built chatbot service which you can integrate into your application. This chatbot has already been trained by Azure and gives accurate answers to questions, but obviously you will have to tweak it according to your application; you'll have to do some configuration so that it answers in the way you want and according to the domain of your application. Okay, so that was the Bot Service.
The next service is the Machine Learning Studio. What is the Machine Learning Studio, guys? It's basically a very simplified way of using machine learning. For example, probably the simplest language to start off with in data science is R, but even as a beginner, if you start learning R you will have to spend some time to first understand the syntax, how it works and so on, and only then will you get the hang of how to create models, how to train them, how to test them, et cetera. What Machine Learning Studio says is: you do not have to learn any programming language. It has a drag-and-drop interface, so all you need to know is data science, AI or machine learning conceptually. If you have the principles of data science understood pretty well, you can start creating your own machine learning model just by dragging and dropping in the UI of the Azure Machine Learning Studio, and you'd be up and ready with your first model in under five minutes, for a reasonably sized data set of course; if you take millions or billions of rows it will take a little more time. But yes, you can create your own machine learning model with Azure Machine Learning Studio without knowing any programming language. That is the power of Machine Learning Studio, and its coolest feature is the drag-and-drop user interface: whatever data set you need, whatever algorithm you want to implement, whatever thresholds you want to set, you just drag and drop everything and it works like a charm. Okay guys, so that was the Machine Learning Studio application for you. The next set of services we are going to discuss, guys, are the identity services in Azure.
Alright guys, so now let's go ahead and understand the core Azure services in the identity domain. Probably the most important service when you are dealing with identity in Azure is Azure Active Directory. Now what do we mean by identity? Identity basically means giving a particular person some level of access to a particular resource that is being used on the Azure cloud. Azure Active Directory is a fully managed, multi-tenant service from Microsoft that offers identity and access management for the users in your organization. If you are acquainted with how Microsoft Windows Server works, you would know that there is a server-side Active Directory as well, the Microsoft Server Active Directory, in which you can specify, for your on-premise applications, which users have access and to what extent. For example, some people can just have read access, some people can have read and write access, and some people can even have admin access to a particular application. The most important part about Azure Active Directory is that it can integrate with your on-premise server Active Directory: people who are on the on-premise infrastructure and want to use on-premise resources or software can be authenticated using Azure Active Directory, and if those on-premise users want to use resources on Azure, that too can be authenticated using Azure Active Directory. Now how does all this work? It's a very simple concept, guys: you add users in Azure Active Directory, and you can assign individual roles to each user. For example, if there is a SaaS application you have deployed on Microsoft Azure, giving a particular user access to that SaaS application is possible by just adding the user and granting him access to that application. You can also do a mass allocation of a particular permission. For example, say you want to give administrative privileges to a set of people; how would you do that? One way is that whenever a person joins the company who you think should have admin access, you add him as a user and give him admin access individually. What you can do instead is define a group on which you have specified admin privileges, and whichever users are added to that group get those permissions or privileges automatically. In that case, whenever someone joins the organization, you just add him to the group and the permission inheritance takes place automatically, because the group has been assigned the admin privileges. Similarly, if you want to create some other group, for example a group for a particular SaaS application, you can do that as well. This concept is a part of Azure Active Directory. So again, in a nutshell, Azure Active Directory is basically a directory in which you add users and specify what those users can do on Azure or on your on-premise software, and these users can be managed in two ways: either you manage them individually by creating users and assigning permissions time and again, or you add the users to already existing groups that you have created with the respective permissions, and they inherit all the permissions that have been assigned to that group. Okay, so that was Azure Active Directory, and that was all about the identity domain in Microsoft Azure.
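If it helps to picture the group-based inheritance described above, here is a toy Python model of it. This is purely illustrative of the idea (users inherit whatever roles their groups carry) and is in no way the Azure AD API, whose objects you would normally manage through the portal, the Azure CLI, or Microsoft Graph.

```python
class Group:
    def __init__(self, name, roles):
        self.name = name
        self.roles = set(roles)          # e.g. {"admin"} or {"saas-app-user"}
        self.members = []

    def add_member(self, user):
        self.members.append(user)
        user.groups.append(self)         # permissions now flow from the group

class User:
    def __init__(self, name, direct_roles=()):
        self.name = name
        self.direct_roles = set(direct_roles)
        self.groups = []

    def effective_roles(self):
        """Directly assigned roles plus everything inherited from groups."""
        roles = set(self.direct_roles)
        for g in self.groups:
            roles |= g.roles
        return roles

admins = Group("Admins", roles={"admin"})
new_hire = User("alice")
admins.add_member(new_hire)              # just add the new joiner to the group...
print(new_hire.effective_roles())        # ...and the admin role is inherited: {'admin'}
```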
Moving forward, guys, now let's talk about the management tools in Microsoft Azure, which are very important for you to know while you're working as an Azure professional. Alright guys, so let's go ahead and understand the management services of Azure. The first service is Log Analytics. What is Log Analytics? The full name of this service is Log Analytics workspace, and what you can do with a Log Analytics workspace is dump all the logs, with respect to what is happening on a particular Azure resource, into this workspace. How do you do that? The first thing you do is go to your management console, go to Log Analytics workspaces, create a Log Analytics workspace and specify a name; it's pretty straightforward. Once you've done that, the next step is to add a data source. To show you quickly, I can just jump to my management console and go to Log Analytics workspaces; here is the sample Log Analytics workspace that I created earlier. I'll go inside it, and as you can see it says connect a data source, so you can connect Azure virtual machines over here, you can specify some other sources if you want to, and you can also stream the Azure activity logs over here. This will start accumulating logs in the workspace. Once that is done, we have to specify monitoring solutions, that is, how you want to read the logs and what you want to do with a particular log level. For example, you have an info level and a critical level in logs, which basically tell you whether a particular command failed or passed. The level field is nothing but something you specify in your log notation: whenever there is an error, I set a field called level to critical, but if things are okay and the entry is just a record of which command executed successfully, I set the level to info. Then, in my monitoring solution, I can specify that whenever a log comes in, check the level field, and if it is critical, flag that log. All of that can be configured inside the monitoring solution. If I click on view solutions, you can check out all the monitoring components you can add to your Log Analytics workspace; as you can see, you can add solutions like Azure Security Center and various optimization catalogs, so there are a lot of things you can add to monitor your logs, and you can choose according to your needs from the Azure Marketplace. Once you specify the monitoring solution, your logs start to flow in, and the monitoring solution flags whichever logs you have asked it to flag; that's how it works. This is the Log Analytics service in Azure, guys, and it helps you manage your Azure resources effectively: pinpoint where a problem is occurring, go over there, see the logs, see on which command the problem occurred, fix the problem and then go ahead with your daily routine. So that is what Log Analytics is, guys.
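Once logs are flowing into a workspace, you typically read them back with a Kusto (KQL) query. As a hedged sketch, this is roughly how you could run such a query from Python with the azure-monitor-query package, filtering on a level-like column; the workspace ID is a placeholder and the table and column names depend on which data sources and solutions you actually connected.

```python
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

client = LogsQueryClient(DefaultAzureCredential())

# Placeholder workspace ID; the table and columns depend on your data sources.
response = client.query_workspace(
    workspace_id="<workspace-guid>",
    query='AzureActivity | where Level == "Critical" | project TimeGenerated, OperationNameValue | take 20',
    timespan=timedelta(days=1),
)

for table in response.tables:
    for row in table.rows:
        print(list(row))
```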
The next service, guys, is Cost Management and Billing. This is basically a native Azure service which helps you manage your bills in Azure. A few things I can tell you off the top of my head: you can manage a budget, that is, you can specify that your Azure bill should not go over a particular limit. When you say it should not go over a particular limit, that can mean that if it goes beyond that, all your services will be stopped; but when you're running a business you usually don't want that. That kind of limit is called a hard limit, where you strictly cap your spend at a particular amount. There is another kind of limit, called a soft limit, which basically just alerts you that your budget has been crossed. You can also make forecasts in this service based on your usage: let's say you have used Azure heavily for five days; it gives you a forecast of what your bill is going to be at the end of the month if you keep using Azure in a similar fashion, and then you can see what kind of tweaks you can make to your services so that your bill comes down. All of this is possible in the Cost Management and Billing service of Azure. This also falls under the management domain, the reason being that it helps you manage your resources in a better fashion.
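As a toy illustration of the forecasting idea (not the Cost Management API, just the arithmetic it is doing for you), here is a Python sketch that extrapolates a month-end bill from a few days of spend; the figures are made up.

```python
import calendar
from datetime import date

# Made-up daily spend (in USD) for the first five days of the month.
daily_spend = [12.40, 11.90, 13.10, 12.75, 12.85]

today = date.today()
days_in_month = calendar.monthrange(today.year, today.month)[1]

average_per_day = sum(daily_spend) / len(daily_spend)
forecast = average_per_day * days_in_month

print(f"Average so far: ${average_per_day:.2f}/day")
print(f"Projected month-end bill: ${forecast:.2f}")

# A soft limit would just alert when the forecast exceeds the budget;
# a hard limit would actually stop services at that point.
budget = 300.00
if forecast > budget:
    print("Alert: forecast exceeds the budget (soft-limit behaviour)")
```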
The next service, guys, is a very important one, which is the Automation account. Now what is an Automation account? It is nothing but a way of deploying Azure resources automatically. How have we deployed Azure resources so far? We have seen a few of the resources in this session today: we deployed a virtual machine, we deployed a storage account, and that was done one by one; I had to go manually into each service, click on add, specify a name, and that's how my resources were deployed. With an Automation account, what happens is you can create runbooks. What are runbooks? You basically write a piece of code in which you specify all the resources, their names and their configurations that you want to deploy, then you upload that runbook to the Automation account and run it. It then automatically creates the resources for you: you don't have to do anything, just put that runbook over there, click the start button, and it creates all the resources with the exact configurations you specified in the code. It is particularly helpful when you're setting up a large infrastructure, where you have to deploy, say, two thousand machines, and those two thousand machines are divided into multiple sets: it could be a backend server set, a database server set, a front-end server set, all of which have different kinds of configurations. If that is the kind of scenario you have, then runbooks are of great help, and that's where the Automation account comes in, which basically helps you run those runbooks.
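Runbooks are usually written in PowerShell or Python. As a hedged sketch, here is roughly what a tiny Python runbook could look like if it provisioned a resource group with the azure-mgmt-resource package, assuming the Automation account has a managed identity with permission to do so; the subscription ID, names and location are placeholders.

```python
# Hypothetical Python runbook: create a resource group if it does not exist.
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

SUBSCRIPTION_ID = "<subscription-id>"          # placeholder

# Inside Azure Automation this would pick up the account's managed identity.
credential = DefaultAzureCredential()
resource_client = ResourceManagementClient(credential, SUBSCRIPTION_ID)

# Declare what we want; running the runbook again is harmless (idempotent call).
group = resource_client.resource_groups.create_or_update(
    "demo-environment",
    {"location": "southindia", "tags": {"created-by": "runbook"}},
)
print(f"Resource group {group.name} is ready in {group.location}")
```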
Okay, our next service is Metrics, guys. Metrics is a service which helps you visualize everything that is happening on your Azure resource, for example what the network throughput is, what the CPU usage is, what the memory usage is; all of that you can see inside Metrics. You can also check a 24-hour window in Metrics and see at what point of time your resource, or the particular metric you are checking, be it CPU usage or memory usage, actually rises, and then you can plan accordingly how to scale your infrastructure at that particular time. That is what Metrics is useful for: it basically gives you an overview of how things are going. If you think about it, it is nothing but your logs being visualized in the form of graphs, but nonetheless it's very helpful, and that is the service provided to you by Azure under the name Metrics. Alright guys, so these were all the services in the management domain.
Next up, guys, we'll be discussing a hands-on. Enough of theory; I guess we have discussed most of the important services that are there in the Azure portal. What we'll do now is this: I will show you an application which currently exists on my localhost, and we will try to migrate it to Azure. We will choose the services based on the knowledge we have gained today, set up the infrastructure, and then see how it works on the cloud. So let's go ahead and do that. Alright guys, let me show you what all we are going to do. First, I have a website that I've already created on localhost which can upload data to Azure Blob storage through the website, so I'm going to show you how I have configured it and how it uploads data to an Azure blob. The second point is to create a new container and upload files to this container from the website, so we will create a container and configure the website to upload files into that particular container. The third thing is to create a MySQL database on Azure: right now my website is storing data in the localhost MySQL engine, and what the hands-on expects us to do is deploy a MySQL database on Azure and have the website push its data to that MySQL database on Azure. And the fourth and final step is that we want our website itself to be uploaded to an Azure Web App, and we have to deploy it using the local Git method; don't worry about what that is, you will get it as we move along. So let's start off with the first point, which is to demonstrate the website on the localhost.
So let me switch to my browser. Guys, this is my Azure portal, and the website I have to demonstrate lives on localhost, under azure01. This is my website, guys, and what it does is upload data onto Microsoft Azure, but right now I have not configured it to connect to my Microsoft Azure storage account; I'll show you how we can configure that. Once a file is uploaded to the blob, the site also makes an entry for it in the database, and right now the database exists on my localhost. So I'm going to open the MySQL console; my MySQL console has a database called images, and if I do a show databases you can see the images database. This is the database my website will be talking to, so let me just use images. In this database I have a table: if I run the show tables command you can see a table called names, but right now there is no data inside, because if I do a select * from names you can see it returns an empty set. So when this website uploads a file successfully, the next step is to put an entry in the database recording the file's name. Okay, let me show you the code, guys, let me show you how the code looks. This is my index.php, let me just open it. Alright guys, this is how my code looks. The first thing I'll have to do is enter a deployment key over here, which is basically my connection string, and the second thing is to enter the container name, which I am going to create in a little while. So let's first jump onto our Azure portal and go inside a storage account. This is my storage account, and what I want is the connection string; in the storage account you go to Access keys, and here you will find two keys, either of which you can use. Let's say I take key1: this is the connection string, so I'll copy it, come back to my code, and replace the deployment key over here; I will also remove the extra placeholder part, which is not required. So that is my deployment key done. The next step is to enter the container name. The hands-on expects us to create a new container, so let us do that: go to the overview, go to Blobs; I already have an intellipaat container over here, so let's create one more container, call it new, set the public access level to blob, and click OK. Alright, my container is now ready and it's called new, so let's put that name in the code: my container name is new. Okay, everything else looks good.
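For readers who would rather see the same idea outside PHP, here is a hedged Python sketch of what the site is doing with the storage account: take the connection string from Access keys, point at the new container, and upload a file under a generated name. It uses the azure-storage-blob package; the connection string and file name are placeholders.

```python
import uuid

from azure.storage.blob import BlobServiceClient

# Placeholder: the connection string copied from the storage account's Access keys blade.
CONN_STR = "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>;EndpointSuffix=core.windows.net"

service = BlobServiceClient.from_connection_string(CONN_STR)
container = service.get_container_client("new")      # the container created above

def upload_file(path):
    """Upload a local file under a random name, like the demo site does."""
    blob_name = f"{uuid.uuid4().hex}{path[path.rfind('.'):]}"   # random name, keep the extension
    with open(path, "rb") as data:
        container.upload_blob(name=blob_name, data=data)
    return blob_name

print(upload_file("database.png"))   # placeholder file
```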
So right now, this part is the connection information for my MySQL database, and it is going to write the data to my localhost MySQL. Let's see if I can upload a file now: let's save index.php, come back to the website, refresh it once, and choose the file we want to upload. Let's pick this image called database, select it and click on submit. It says blob uploaded successfully and new record created successfully, so let's verify that in our database. I'll do a select * from names, and as you can see there is a new entry over here, a random-looking name like 680345573; this is the name which has been assigned by my website. It is a random name because you could be uploading duplicate files, and if you upload duplicate files you don't want the names to collide, so I assign a random name to every file that is uploaded. So this is the file which has been uploaded, and we can check it over here: if I click on check list, this is the file, and if I click on it I should be able to download it. But it says blob not found, and the reason is that I have to change the URL inside my list page; if I click on check list, that is list.php, and I'll have to change some code there. I will show you that, but first let's go inside the new container, and as you can see there is a PNG file over here. I can download this PNG file directly from my list page, but in order to do that I have to click on the file, copy its URL and see what the prefix is; I'll copy this prefix, go inside my code, open list.php, and this is the URL I have to replace, so let's replace it. Great, now let's save it, come back to the website, refresh list.php once, and now when I click on the file you can see it downloads automatically; let's open it, and this is the image which was downloaded. Now let's try something else: let me show you the image first and then upload it. I'll go inside Downloads, and say I want to upload this machine file, which looks something like this. To upload it, let's go to our website, choose the file, select the machine image, click on open and click on submit. It says blob uploaded successfully; if I go inside my container and refresh it you can see there's a new entry over here, and even in my database, if I refresh, there's a new entry. Now I can just go to my list, refresh it, see the new entry, click on it, the file gets downloaded, click here and this is the file which was uploaded. So what is happening here is that the file is now hosted on the blob, and any of you, if you click on this link, will be able to download it. But the problem is that this website right now is only available on my localhost, and I have to make it available to the world. How can I do that? First things first, let us go back to our slides and see what the next step is.
So our next step is that I want to create a MySQL database on Azure. Right now the database being updated is on my localhost, but what I want is for it to be updated on my Azure MySQL database. Let's see how we can do that. The first thing I'll do is go back to my Azure dashboard, go to services, go into databases, and the one I want to launch is Azure Database for MySQL, so let's select it and add a database. I want it inside my demo-environment resource group; for the server name let's specify one, for the admin username let's specify intel, and for the password let's try intel@123. Okay, it says the password cannot contain all or part of the login name; no issues, let's make it azure@123, and the same in the confirm field. Alright, and where do I want to launch it? In South India. The version is 5.7, which is great. Now let's change the size of the server: click on configure server, I want a basic configuration, basically one vCore and the least amount of storage, which is 5 GB; auto growth, no, I don't want it; the backup retention period, the least is seven days, which is fine; and I think that's it, let's click on OK. The server has now been configured, so let's click on review plus create. Our username is intel, the password is azure@123, and now let's click on create. Alright, my deployment is underway, which basically means my MySQL database is now getting deployed; let's wait for it to complete and then we will proceed with our demo. Alright guys, my deployment is now complete,
so I can just click on Go to resource, and I am in. This is my database, guys. In order to access my database, this is the server name I have to use, so let's copy it, come back to the command prompt, and run mysql -h with the server name; the next thing is the username, which in my case is this, so let's copy and paste it, and then the password, which is azure@123. But this fails: it says client with this IP address is not allowed to connect to this MySQL server. Let's solve that: go inside Connection security, and I will add my client IP, which is this one. That's been added. Also, guys, turn the enforce SSL connection setting off, because right now we don't want to get into making an SSL connection since this is a demo. Everything looks good, let's click on save, and once this rule has been set we should be able to connect to the database. Alright guys, it says it has successfully updated the connection security, and now when I go back to my command prompt and try the same command with the password azure@123, you can see I have successfully connected to my MySQL database on Azure. Now, if I do a show databases, you can see that only the default databases are present, so let us change that. Before exiting, let us create an empty database called images, then use images and create a table: create table names, with a single column, name varchar(20). So we have created it, and now let's go to our code. Here is my code, guys; now I have to specify the server name, which in my case is this, so let's copy it; the username is this, and the password is azure@123. Everything looks good, let's save the file. Similarly, in index.php, let's make the same changes: my server name is this, let's copy it, my username is this, let's copy it, and my password is azure@123. Alright, let's save the file, and now let's go to the website and check the list; it shows us an empty list because there's no data in the Azure MySQL table yet. Great, now let's choose a file, try to upload some random file, and click on submit. It says blob uploaded successfully and new record created successfully, so let's check that: I do a select * from names and I can see there is a new entry over here; if I do a check list, this is the new entry, and if I click on it I can download the file, and this is the file I uploaded. Great, so now my website is writing data to my Azure database and at the same time uploading the file to my storage account, inside the container which is new; you can see it under Blobs, new, and this is the new file which was just uploaded.
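If you wanted to do the same database step from Python instead of PHP, a hedged sketch with the PyMySQL package would look roughly like this. The host, user and password mirror the placeholders above (note that Azure Database for MySQL single server expects the user in the user@servername form), and in a real setup you would likely keep SSL enabled rather than turning it off as in the demo.

```python
import pymysql

# Placeholders matching the demo: Azure MySQL server, admin user, database.
conn = pymysql.connect(
    host="<servername>.mysql.database.azure.com",
    user="intel@<servername>",        # single-server style login; may differ for flexible server
    password="azure@123",
    database="images",
)

try:
    with conn.cursor() as cur:
        cur.execute("CREATE TABLE IF NOT EXISTS names (name VARCHAR(20))")
        cur.execute("INSERT INTO names (name) VALUES (%s)", ("680345573",))
        cur.execute("SELECT name FROM names")
        for (name,) in cur.fetchall():
            print(name)
    conn.commit()
finally:
    conn.close()
```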
Great, guys. Now what I want to do is make this website public. The website is working fine over here, but I want it to be usable by everyone in the world. How can I do that? We saw a service earlier which can simply host a website without asking us to install any software, without asking us to log into the operating system and do any configuration; it just gives us a dashboard through which we can upload our website. So let's go ahead and do that. The service I'm talking about is App Services, so let's go inside App Services and click on add. It will ask me for the resource group, so I'll say demo-environment; for the name of the instance let's specify inteldemo, okay, that's already been taken, so let's try another name, and this one seems to be available. What do I want to publish? I want to publish code. The runtime stack is PHP 5.6, that's the one; the region we want to deploy in is South India, so let's select that. Great, now let's change the size of the server which will be launched: we are basically in a dev/test environment, and this is the minimum configuration we can launch, so let's click on apply, and now let's click on review and create. Over here we can review everything we are launching: we selected PHP 5.6 and specified the region and the size, and that's all we did. Now let's click on create. What will happen now is that Azure will basically deploy a server on which it will install PHP 5.6 and Apache, and then it will give me a URL; when I go to that URL I will see a sample app. It does not give me access to the operating system and it does not ask me to install any software; all we did was select the runtime stack, which is PHP 5.6, and specify the server configuration, and that is all we need to configure and all we get access to. Once it gets deployed, I will show you how you can upload your code onto this particular web app. As you can see, it is already deployed, so let's go to the resource, and this is the resource, guys; the status is running right now, and if I browse this web app it should show me the sample web app which has been deployed. Let's wait for the website to appear; since the web app has just been launched it might take some time to show the site, but it will show up. Let me try stopping and refreshing it once. Alright, it's going to take its sweet time, so let's give it that. Alright, so as you can see, it says: hey, App Service developers, your app service is up and running, time to take the next step and deploy your code.
Great, so that is exactly what we want to do now. To deploy your code you have to go inside the Deployment Center, so let's go inside that. Once you're in the Deployment Center it asks me where my source control is. My source is on my local machine, but my local folder is not a Git repository as of now, so I can just go to my code folder, which is azure01, right-click there and open Git Bash; let me make the text a little bigger so that it's visible. Alright guys, now I'm going to initialize an empty Git repository with git init, then add all the files to the Git file system, that is, stage them, with git add. Once that is done, I'll commit them with git commit -m "first". Now the files are committed, and what I want to do is upload this code onto my web app. Going back to the portal: I do have a local Git repository where my code is checked in, so let's select the Local Git option and click on continue. Now it asks me which build provider to choose; don't worry for now about what Azure Pipelines and the Kudu engine are; basically, if you select the App Service Kudu build server you don't have to do any extra configuration, and we can discuss this more in further sessions. Right now just select the Kudu engine and click on continue. It says your local Git repository URL will be generated upon completion, the branch will be master, and the build provider is the App Service build service; great, let's click on finish. It is now going to set up the Git environment for me on the web app, and what I will get is a URL, my Git URL, which I will use to upload my code. But before that, I also have to set the deployment credentials, so I click on FTP/Credentials, go to user credentials, and I can specify anything over here: let's say the username is intellipaat, let's set a password, and save the credentials; they save successfully. Now I can close this, go back to my resource, go to the Deployment Center, and take the Git URL from over here; this is the Git URL, let's copy it. Back in Git Bash, let's add it as a remote: the syntax is git remote add azure followed by that link. And now I want to push my code onto Azure, so git push azure master, hit enter, and it asks me for the credentials: the username is intellipaat and the password is intel@123. Oh, authentication failed; I think I forgot the password, no issues, I can just set it again in the portal to intel@123, confirm it, and the changes are saved. Let's go back and run git push azure master again, and now it should push my code onto the web app. It will take some time over here, guys, around a minute or so to upload your code, so let's wait and hope everything goes well. Alright, the process has started: it is updating the branch, now copying all the files, great. So my code is now uploaded to the app and it says it will restart in ten seconds, great. Meanwhile, I can go to my web app and just refresh it, and as you can see, my website is now available over here.
Alright, if I click on check list you will see an error, and I will tell you why that error is there: it says connection failed, client with this IP address is not allowed to connect to this MySQL server. How can you solve that? Just go back to your database, go to Connection security, and turn on allow access to Azure services; and as a security measure, remove the client IP address that we added earlier, and click on save. Now my security settings will get updated, and then I will be able to use my website on the web app. Alright, my settings are about to be updated, let's wait for that and then we'll go ahead. Alright guys, the rule has been updated successfully, so if I go back here and click on refresh, you can see the list is now being generated. Let's try to upload a file, guys; let's try to upload this Word file, click on open and click on submit. It says blob uploaded successfully, so let's check that: go to the resource group, go to our storage account, go into the container, and this is the docx file which was just uploaded; and if I refresh the list, I can see the docx file over here, and if I click on it I'd be able to download it.
And guys, with this we have successfully completed our demo, but let me show you a very awesome thing that comes with connecting Git to your web app. I can simply go to my index.php and, say, change the heading: below "Upload to Azure blob" let's add a little bit of text, say "Welcome to Intellipaat", and save it. Come back to the Git terminal and do a git status; you can see that index.php has been modified. Great, let's stage the file, commit it with the message "updated index", and now push it with git push. This will again take a minute or two, and then our files will be updated on the web app automatically. Alright, my code has been updated, so let's go ahead and check it. Let me just refresh the website, and as you can see the change has come through: it says "Upload to Azure blob, welcome to Intellipaat", which is exactly what we changed. Alright guys, with this we have successfully completed our demo; let's summarize what we did. Coming back to my slides: we demonstrated the website on the localhost, fine; then we created a new container and uploaded the blob files over there; after that we deployed a MySQL database on Azure, and the localhost website was then able to insert data into the MySQL database on Azure; and finally we deployed an Azure web app and deployed our website onto it using the local Git method, and we also checked that if we update anything and push it through Git, the files are successfully uploaded onto the web app.
So with this, guys, we've successfully completed our demo. Our next topic is a quiz, so let's go ahead and see some of the questions that we can answer after attending this session. Our first question is: which of the following is not a Platform as a Service? The options are MySQL Database for Azure, App Service, Azure VMs, or none of these. You may want to pause the video and think about it. So, did you guess the answer? Yes, the answer is Azure VMs. The next question is: can an Azure VM be deployed without a virtual network? The options are yes, no, or none of these. What do you think? Yes, you got it right, the answer is no. Let's now move on to the third question: what is Azure Active Directory useful for? Is it (a) monitoring, (b) identity management, (c) automation, or (d) none of these? Now guys, I will not give you the answer to this particular question; you'll have to answer it for me in the comment section below. Right, let us first understand what exactly an Azure certification is.
So, an Azure certification is a level of Microsoft Azure cloud expertise that an IT professional obtains after passing one or more exams that Microsoft offers; basically, it is meant to demonstrate and validate the technical cloud knowledge and skills one has obtained. Now, which certifications are those? That is what we shall discuss as we move forward. Here I'm going to list the major Azure certifications, and after that we will quickly discuss each of them. The first one is Microsoft Certified Azure Administrator Associate, that is AZ-103; the next one is the Azure Developer Associate, that is AZ-203; the third one is the Microsoft Certified Azure Solutions Architect, which comes under the expert level, that is AZ-300 and AZ-301; and the last one covers Azure Data Lake and Data Factory, it is usually for performing data engineering on Microsoft cloud services and is commonly known as exam 70-776.
Now let's have a quick glance at these certifications. The AZ-103 certification falls under the associate level. With this certification, Microsoft aims to help candidates learn and acquire the wide range of skills required to be a cloud administrator, such as managing various cloud services, security, networking, storage and many more. Previously, a candidate had to pass the AZ-100 and AZ-101 certification exams in order to achieve this certification, but after receiving constant feedback from learners about the exams being difficult and having too much syllabus, Microsoft Learning decided to merge those two certification exams and named the result the AZ-103 certification exam, with roughly 70 percent of the syllabus coming from AZ-100 and 30 percent from AZ-101. Up next we have AZ-203. This is another associate-level, role-based certification exam from Microsoft Azure; with this certification, Microsoft aims to help candidates learn all the skills required in the development domain, such as designing and building cloud applications and services, and many more. Before the AZ-203 certification exam was introduced, AZ-200 and AZ-201 were in the picture for getting certified as an Azure Associate Developer, but both of those exams were retired and AZ-203 took their place. Again, after receiving constant feedback from learners about the exams being difficult and having too much syllabus, Microsoft Learning decided to take a similar step for this certification as well: the AZ-203 exam came into the picture, and it takes approximately 70 percent of its objectives from the AZ-200 exam and approximately 30 percent from AZ-201.
Now moving forward to the next exams, AZ-300 and AZ-301. This is the first role-based certification launched at the expert level. With this certification, Microsoft aims to help candidates learn the most advanced Azure skills, along with learning to design secure, reliable and scalable solutions for businesses. After the certification, a candidate is expected to have gained expertise in compute, networking, security and storage. Even though the certification also covers the skills of the Azure Administrator and Azure Developer associate-level certifications, those are not a mandatory prerequisite. In order to achieve the certification you will have to pass the following exams: AZ-300, that is Microsoft Azure Architect Technologies, and AZ-301, that is Microsoft Azure Architect Design. These two exams are the replacement for the old 70-535 exam, Architecting Microsoft Azure Solutions. Now moving on to the 70-776 certification exam: this certification requires candidates to get accustomed to implementing big data engineering on Azure, so learners must have skills in Microsoft Azure SQL Data Warehouse, Azure Data Lake Analytics, Azure Data Factory and Azure Stream Analytics, and subsequently be able to apply big data best practices using the same. As informed by Microsoft Learning, the 70-776 certification exam is about to be retired and no replacement exam name has been given yet; however, the replacement exam would consist of all the topics from the current curriculum, with a few topics removed or merged with other topics. So let's move forward.
Now guys, there is a huge amount of confusion when it comes to choosing which certification one should go for, so moving forward with the video, let us discuss which certification is for whom. For the AZ-103 exam, candidates are expected to have sufficient knowledge of various services across the full IT lifecycle, applications and environments; at least one year of experience in IT administration and hands-on experience with server provisioning, monitoring and resource management is recommended. For the AZ-203 exam, in order to pass, the candidate must have at least one year of experience in developing scalable solutions, knowledge of all phases of software development, and skill in at least one cloud-supported programming language; so this exam requires a candidate to be proficient in software development phases such as solution design, development, deployment, testing and maintenance. Next we have the AZ-300 and AZ-301 exams. For the AZ-300 exam, a candidate must have expert-level skills in at least one of the expert-level domains; this exam requires some knowledge of various concepts and IT operations such as networking, virtualization, business continuity, data management and disaster recovery. For AZ-301, in order to pass, the candidate must be skilled in Azure administration and Azure development, and DevOps skills are also recommended; candidates taking this exam are expected to be able to build Azure solutions according to business requirements, such as making decisions that make the business more secure and more scalable. Now moving forward to 70-776: this certification is for candidates who design analytics solutions and then operationalize those solutions on Azure, candidates who are familiar with the features and capabilities of batch data processing, real-time processing and operationalization technologies, and of course data engineers. Now let's move forward.
So guys, in order to be well prepared for Azure certification exams, you must know what the objectives of the exam are. Following the same order as before, let's first discuss the curriculum for the AZ-103 exam. Due to constant feedback about the exam being difficult, Microsoft has divided the entire exam syllabus into five modules, so to ease the exam preparation let us discuss each of these modules. The first is manage Azure subscriptions and resources; the percentage figure you see against each module basically indicates the relative weightage of questions from that module, so the higher the percentage, the more questions you should expect from that module in the exam. For this module you need to be thoroughly prepared in managing Azure subscriptions and resource groups, analyzing resource utilization and consumption, and managing role-based access control, that is RBAC. Second, we have implement and manage storage, which holds a weightage of 15 to 20 percent in the exam, covering topics like creation and configuration of storage accounts, Azure Files, importing and exporting data to Azure, and implementing Azure backups. Third is deploying and managing virtual machines, holding a weightage of 15 to 20 percent, covering creation and configuration of virtual machines for Windows and Linux, managing Azure VMs, automating their deployments and managing their backups. Fourth is configure and manage virtual networks, which holds the majority weightage of around 30 to 35 percent and covers creating connectivity between virtual networks, implementing and managing virtual networking, network security groups, Azure Load Balancer, monitoring and troubleshooting virtual networks, and integrating an on-premises network with an Azure virtual network. And the last one is manage identities, holding a weightage of 15 to 20 percent in the exam; managing Azure Active Directory and AD objects (users, groups and devices), and implementing and managing hybrid identities and multi-factor authentication are covered in this module. Now for the AZ-203 exam: the entire exam syllabus has been divided into six modules, again based on the feedback received.
The first module is develop Azure Infrastructure as a Service compute solutions, which holds a relative weightage of 10 to 15 percent; for this module you need to be thoroughly prepared in creating containerized solutions, implementing batch jobs by using Azure Batch services, and implementing solutions that use virtual machines. Second, we have develop Azure Platform as a Service compute solutions, which holds a weightage of 20 to 25 percent in the exam, covering topics like creating Azure App Service web apps and API apps, Azure App Service mobile apps, and implementing Azure Functions. The third one is developing for Azure storage, holding a weightage of 15 to 20 percent, covering developing solutions that use blob storage, relational databases, Cosmos DB storage and storage tables. The fourth is implementing Azure security, holding a weightage of 15 to 20 percent and covering topics like implementing access control, authentication and secure data solutions. The fifth one is monitoring, troubleshooting and optimizing Azure solutions, holding a weightage of 15 to 20 percent in the exam; instrumenting solutions to support monitoring and logging, integrating caching and content delivery within solutions, and developing code to support the scalability of apps and services are covered in this module. And the last one is connect to and consume Azure services and third-party services, which holds a weightage of 20 to 25 percent in the exam, covering topics like establishing API gateways, integrating Azure Search within solutions, and developing App Service Logic Apps.
Similarly, for the AZ-300 and AZ-301 exams, the syllabus covers topics like deploy and configure infrastructure, implement workloads and security, create and deploy apps, implement authentication and secure data, develop for the cloud and for Azure storage, and determine workload requirements. After this comes the design part: design for identity and security, design a data platform solution, design a business continuity strategy, design for deployment, migration and integration, and design an infrastructure strategy. Now, at last, for 70-776: although the certification is on the verge of retirement, its replacement is not going to be very different; only a few transitions will happen, like the removal of a few topics or merging them with other topics. If we talk about its objectives, the first is design and implement complex event processing by using Azure Stream Analytics, holding a weightage of 15 to 30 percent, which includes topics like ingesting data for real-time processing, designing and implementing Azure Stream Analytics, implementing and managing the streaming pipeline, and querying real-time data by using the Azure Stream Analytics query language. The next one is design and implement Azure SQL Data Warehouse solutions, with a weightage of 15 to 20 percent, including topics like designing tables in Azure SQL Data Warehouse, querying data in Azure SQL Data Warehouse, and integrating Azure SQL Data Warehouse with other services. Then comes design and implement cloud-based integration by using Azure Data Factory: implementing data sets and linked services, moving, transforming and analyzing data by using Azure Data Factory activities, orchestrating data processing by using Azure Data Factory pipelines, and monitoring and managing Azure Data Factory are the topics included here. The last one is manage and maintain Azure SQL Data Warehouse, Azure Data Lake, Azure Data Factory and Azure Stream Analytics, holding the majority of the weightage in the exam, that is 20 to 25 percent; this module covers provisioning Azure SQL Data Warehouse, Azure Data Lake, Azure Data Factory and Azure Stream Analytics, implementing authentication, authorization and auditing, managing data recovery for Azure SQL Data Warehouse and Azure Data Lake, and designing and implementing storage solutions for big data implementations. So that's all for the objectives part; now let's
move forward and discuss the exam pattern, followed by how you should prepare for the exam. Now, the Microsoft Azure certification exams are some of the most challenging exams in the IT industry. Although the number of questions in these exams is subject to change over time, you can expect around 40 to 60 questions, and you can also expect different question formats and types, including review screen, mark review, multiple choice, short answer, hot area, repeated answer choices, drag and drop, case studies, best answer and active screen; any question in the exam can follow any of these question types. You will get around 150 minutes to complete the examination, with an additional 30 minutes of seating time. For good results in the exam, it is advisable to follow the weightage associated with each module during your exam preparation. The pricing depends on which location you are taking your exam from: for example, if you are taking your exam in the US, say, it is going to cost you around 160 USD, but if you are taking it in India you will have to pay an amount of 4,800 INR; the pricing is subject to change without notice from country to country and does not include applicable taxes, so you need to confirm the exact amount with your examination provider. If you're a student, you can get a reduction in the exam fee if you can submit your valid education credentials; moreover, Microsoft Partner Network program members, Microsoft trainers and Academy program members are also eligible for reduced pricing. Most of the Microsoft technical exams require a passing score of 700: any score greater than or equal to 700 will be marked as pass, otherwise it will be marked as fail. Most of the questions in this exam are worth one point; if any question is worth more than that, it will be indicated in the question itself. Also note that there is no penalty for an incorrect answer. The exam is available only in the English language. Now that we know the examination pattern,
let us discuss few of the tips for the exam preparation how to prepare for this
exam the first step is to plan the module structure and study accordingly
For example, if you are going to start with App Services in containers, then make sure you have already covered the topics of creating App Services, containers, container images, and Dockerfiles, so that you know the basics of implementing App Services in a container. Or, if you are going to implement Azure Backup, then you must know how to create and configure storage accounts along with the configuration of Azure Files, so that you know the basics of implementing and configuring storage for them. You would require a lot of practice questions and verified study materials on Azure in order to earn a good passing score; you can refer to the official curriculum and study guide to plan your study and get the right information. After that, practice sample tests online. Once you have gained enough knowledge with this, then you should be able to step up into the next part, practicing hands-on, and this is the most crucial part in your learning journey. You can easily perform hands-on on Azure services since Azure provides a free tier option for newbies and you get $300 of credit in your Azure account once you sign up. In addition to that, check for online trainings so that in case you have queries related to the subject, you can clear them right away, and expert help is always recommended. Last but not the least, join the forums related to Microsoft Azure, search for questions related to your subject, and check out the questions asked by other people and go through the answers posted by the audience; it will always be helpful for you. The last suggestion from my end is, go for dumps only when you are thorough with the subject. You might be able to clear the exam and get the certification from them, but believe me guys, you won't be job ready, as during the interview, without the practical approach, you won't be able to crack it or be able to answer the questions asked by the interviewer, because you took the shortcut to get certified. So beware of the shortcuts, do a thorough study with proper implementations in Azure, and rise up in your career path.
So now you might be wondering how to take out time for all this research and study; that is where we at Intellipaat are here for you. We understand that you as professionals face problems taking out time for upskilling from your personal life, so that is why we have done all the hard work for you and have come up with comprehensive courses on AZ-103, AZ-203, AZ-300 and 301, and Azure Data Lake and Data Factory. Each course is curated by industry experts and includes different case studies and assignments based on each module, along with industry-oriented projects. If you are interested, then you can go through the course details with the link provided in the description box. Now, how did AZ-103 come into existence?
Previously, Azure had launched two new role-based certifications, AZ-100 and AZ-101, as a replacement for the 70-533 exam for the Microsoft Azure administrator role, but recently, on May 1st, 2019, both the AZ-100 and AZ-101 exams got retired and were replaced by the AZ-103 exam. So if one is willing to become a Microsoft certified Azure administrator, then they should focus on AZ-103 exam preparation; that is how it came into existence. With such a sudden transition from AZ-100 and AZ-101 towards a new AZ-103 certification, there must be a reason behind this change, right? That we shall discuss as we move forward. In the early stages of the Microsoft Certified Azure Administrator Associate certification, Microsoft Learning received a lot of negative feedback about the new AZ-100 and AZ-101 exams being very difficult for people to crack. Based on that feedback, they decided to combine the AZ-100 and AZ-101 certification exams into a single AZ-103 certification exam, that is, the Microsoft Azure Administrator certification exam. The combined AZ-103 exam is not merely a merging of the full AZ-100 and AZ-101 certification exams; the AZ-103 exam pulls in approximately seventy percent of its objectives from the AZ-100 exam and approximately thirty percent of its objectives from the AZ-101 exam. As a result of the transition, the AZ-100 and AZ-101 certification exams are now retired as of May 1, 2019, and can no longer be scheduled or taken. Additionally, the transition exam AZ-102 is also retired, as the single AZ-103 exam is the singular path to earning the Microsoft Certified Azure Administrator Associate certification. For those of you who are early achievers out there, I understand this may be a bit frustrating for you, but this change will simplify the certification process, and it is a good change to be made. Next is why you should opt for the AZ-103 certification.
Today, the Microsoft Azure administrator has turned out to be one of the most esteemed job titles in the cloud industry. How? Since Azure is one of the pioneers in providing cloud services and it has got the majority of its share in the cloud market worldwide after AWS, which means greater job opportunities, and no doubt it has been used by the majority of the Fortune 500 companies. Moreover, the Azure administrator profile guarantees attractive pay packages along with solid career growth opportunities; according to the latest reports, the average pay of an Azure administrator is around $100k per annum, and an experienced Microsoft Azure administrator can earn around $220k per annum. So I guess this should be enough to satisfy a candidate who is willing to go for AZ-103 exam preparation. Moving forward, in order to be well prepared for the AZ-103 exam, you must know what the prerequisites for the exam are: you need to have at least intermediate server administration knowledge and skills, a basic knowledge of PowerShell, and a general familiarity with cloud computing concepts. Now let's move forward and discuss the exam objectives. Due to constant feedback about the exam being difficult, Microsoft has divided the entire exam syllabus into five modules in order to ease exam preparation, so let's discuss each of these modules below.
The first is manage Azure subscriptions and resources. The percentage figure that you see here basically indicates the relative weightage of the questions from each of the modules; if the percentage is higher, then you should expect more questions from that module, and vice versa. In this module, you need to be thoroughly prepared with managing Azure subscriptions and resource groups, analyzing resource utilization and consumption, and managing role-based access control. The second is implement and manage storage, where it holds a weightage of 15 to 20 percent in the exam, covering topics like creation and configuration of storage accounts, Azure Files, importing and exporting data to Azure, and implementing Azure backups. The third is deploying and managing virtual machines, holding a weightage of 15 to 20 percent, covering topics like creation and configuration of virtual machines for Windows and Linux, managing Azure virtual machines, automating their deployments, and managing their backups.
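To give a feel for what managing Azure virtual machines looks like in practice, here is a minimal sketch using the Azure SDK for Python, assuming the azure-identity and azure-mgmt-compute packages; the subscription ID, resource group, and VM names are hypothetical placeholders, not anything prescribed by the exam:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

# The subscription ID, resource group, and VM name below are hypothetical placeholders.
subscription_id = "<your-subscription-id>"
compute = ComputeManagementClient(DefaultAzureCredential(), subscription_id)

# List every VM visible in the subscription along with its size and region.
for vm in compute.virtual_machines.list_all():
    print(vm.name, vm.location, vm.hardware_profile.vm_size)

# Stop (deallocate) one VM; begin_* operations are long-running and return a poller.
poller = compute.virtual_machines.begin_deallocate("my-resource-group", "my-vm")
poller.result()
```

The begin_* operations are long-running, so they return pollers that you wait on with result().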
The fourth one is configure and manage virtual networks, which holds the majority of the weightage, around 30 to 35 percent, covering topics like creating connectivity between virtual networks, implementing and managing virtual networking, network security groups, and Azure Load Balancer, monitoring and troubleshooting virtual networks, and integrating an on-premises network with an Azure virtual network. The fifth and last one is manage identities, holding a weightage of 15 to 20 percent in the exam; managing Azure Active Directory and Active Directory objects (users, groups, and devices) and implementing and managing hybrid identities and multi-factor authentication are covered in this module. Now let's move forward and discuss the exam pattern. No doubt, the AZ-103 exam is one of the most challenging exams in the IT industry. Although the number of questions in this exam is subject to change over time, you can expect around forty to sixty questions, and you can also expect different question formats and types in this exam, including review screen, mark review, multiple choice, short answer, hot area, repeated answer choices, drag and drop, case studies, best answer, and active screen; all the questions in this exam can follow any of these question types. You will get around 150 minutes to complete the examination, with an additional 30 minutes of seating time. For good results in the exam, it is important to follow the weightage associated with each exam module during your AZ-103 exam preparation. The pricing depends on which location you are taking your exam from; for example, if you are taking your exam in the US, then it is going to cost you around $165, but if you are taking it in India, then you will have to pay an amount of 4,800 Indian rupees, and the pricing is subject to change without notice from country to country. As the pricing does not include applicable taxes, you need to confirm with your examination provider for an exact fee. If you are a student, then you can get a fee reduction in the exam if you can submit your valid educational credentials. Moreover, Microsoft Partner Network program members, Microsoft trainers, and Academy program members are also eligible for reduced pricing. As most of the Microsoft technical exams require a passing score of 700, AZ-103 is not an exception: any score greater than or equal to 700 will be marked as pass, otherwise it will be marked as fail. Most of the short questions in this exam are worth 1 point; if any question is worth more than that, then it will be indicated in the exam itself. Also note that there is no penalty for an incorrect answer, and the exam is available only in the English language. Now that we know the examination pattern, let's discuss a few tips for preparing for the exam. How to prepare for the AZ-103 exam: the first step is to plan the module structure and study accordingly. For example, if you are going to implement Azure Backup, then you must also know how to create and configure storage accounts along with the configuration of Azure Files, so that you know the basics of implementing and configuring storage accounts.
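As a taste of that hands-on work, here is a minimal sketch of talking to a storage account with the azure-storage-blob Python package, assuming you already have an account and have exported its connection string; the container and blob names are just examples:

```python
import os
from azure.storage.blob import BlobServiceClient

# Assumes the storage account's connection string is exported as an environment variable.
conn_str = os.environ["AZURE_STORAGE_CONNECTION_STRING"]
service = BlobServiceClient.from_connection_string(conn_str)

# Create a container and drop a small blob into it (names are just examples).
container = service.create_container("study-notes")
container.upload_blob(name="day1.txt", data=b"configured my first storage account today")

# List what the container now holds.
for blob in container.list_blobs():
    print(blob.name)
```

Azure Files and Azure Backup are configured in the same spirit, just through their own clients and the portal.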
You would require a lot of AZ-103 practice questions and verified study materials in order to earn a good passing score; you can refer to the official curriculum and study guide to plan your module study and get the right information. After that, practice sample tests online. Once you have gained enough knowledge with this, then you should be able to step up into the next part, practicing hands-on, and this is the most crucial part in your learning journey. You can easily perform hands-on on Azure services since Azure provides a free tier option for newbies and you get $300 of credit in your Azure account once you sign up, which is valid for almost a year. Practicing hands-on will not only help you in cracking the exam, but it will also get you ready for your future job roles. In addition to that, check for online trainings so that in case you have queries related to the subject, you can clear them right away, and expert help is always recommended. Last but not the least, join the forums related to Microsoft Azure, search for questions related to your subject, and check out the questions asked by other people and go through the answers posted by the audience itself; it will always be helpful for you. The last addition from my end is, go for dumps only when you are thorough with the subject. You might be able to clear the exam and get the certification from it, but believe me guys, you won't be job ready, as during the interview, without the practical approach, you won't be able to crack it or be able to answer the questions asked by the interviewer, because you took the shortcut to get certified. So beware of the shortcuts, do a thorough study with proper implementations in Azure, and rise up in your career path. Now you might be wondering how to take time for all this research and study; that is why we at Intellipaat are here for you. We understand that you as professionals face problems taking out time for upskilling from your personal life; that is why we have done all the hard work for you and have come up with a comprehensive course on the AZ-103 certification exam. So let's move forward.
Now, how did AZ-203 come into existence? Before the AZ-203 certification exam was introduced, AZ-200 and AZ-201 were in the picture in order to get certified as an Azure associate developer, but both of these exams got retired and AZ-203 took their place; that is how it came into existence. Everything happens for a reason, and there must be a reason behind the change, right? That we shall discuss as we move forward. The reason for this is that, in the early stages of the Microsoft Certified Azure Developer Associate certification, Microsoft Learning received a lot of negative feedback about the beta exams, AZ-200 and AZ-201, being very difficult for people to crack. Based on that feedback, they decided to combine the AZ-200 and AZ-201 certification exams into a single AZ-203 exam, that is, the Developing Solutions for Microsoft Azure certification exam. The combined AZ-203 exam is not merely a merging of the full AZ-200 and AZ-201 certification exams; the AZ-203 exam pulls in approximately 70% of its objectives from the AZ-200 exam and approximately 30% of its objectives from the AZ-201 exam. As a result of the transition, the AZ-200 and AZ-201 certification exams are now retired and can no longer be scheduled or taken. Additionally, the transition exam AZ-202 is also retired, as the single AZ-203 exam is the singular path to earning the Microsoft Certified Azure Developer Associate certification. And guys, for those of you who are early achievers out there, this may be a bit frustrating for you, I know that, but this change will simplify the certification process, and it is a good change to be made. Next, why you should opt for the AZ-203 certification.
Today, the Microsoft Azure developer has turned out to be one of the most esteemed job titles in the cloud industry. How? Since Azure is one of the pioneers in providing cloud services and it has got the majority of its share in the cloud market worldwide after AWS, which means greater job opportunities, and no doubt it has been used by a majority of the Fortune 500 companies. According to the latest reports, the average pay of an Azure developer is more than $120,000 per annum, and it isn't a secret that an experienced Microsoft Azure developer can earn up to $200,000 per annum. So I guess this should be enough to satisfy a candidate who is willing to go for AZ-203 exam preparation. Moving forward, in order to be well prepared for the AZ-203 exam, you must know what the prerequisites for the exam are: you need to have at least four years of experience in the development of scalable solutions that span the entire software development lifecycle, including software design, development, deployment, testing, and maintenance; you should have prior experience and knowledge of the Azure platform; and you must have past experience in developing applications using programming languages such as C# or Python. Now let's move forward and discuss the exam objectives. Due to the constant feedback about the exam being difficult, Microsoft has divided the entire exam syllabus into six modules in order to ease exam preparation.
Let us discuss each of these modules below. The first is developing Azure Infrastructure-as-a-Service compute solutions. So guys, this percentage figure that you see here, that is, 10 to 15%, basically indicates the relative weightage of the questions from each of the modules; if the percentage is higher, then you should expect more questions from that module, and vice versa. In this module, you need to be thoroughly prepared with creating containerized solutions, implementing batch jobs by using Azure Batch services, and implementing solutions that use virtual machines. The second is develop Azure Platform-as-a-Service compute solutions, where it holds a weightage of 20 to 25 percent in the exam, covering topics like creating Azure App Service web apps and API apps; it also covers topics such as Azure App Service mobile apps and implementing Azure Functions.
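To make implementing Azure Functions a bit more concrete, here is a minimal sketch of an HTTP-triggered function in Python; it assumes the classic programming model where the trigger and bindings are declared in an accompanying function.json, which is not shown here:

```python
# __init__.py of an HTTP-triggered Azure Function; the trigger and bindings
# are declared in an accompanying function.json, which is not shown here.
import azure.functions as func


def main(req: func.HttpRequest) -> func.HttpResponse:
    # Read an optional "name" query parameter and echo a greeting back.
    name = req.params.get("name", "Azure")
    return func.HttpResponse(f"Hello, {name}!", status_code=200)
```

Locally you would run this with the Azure Functions Core Tools (func start) and then deploy it into a Function App.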
The third is developing Azure storage, holding a weightage of 15 to 20 percent, covering topics like developing solutions that use blob storage, relational databases, Cosmos DB storage, and storage tables. The fourth one is implementing Azure security, holding a weightage of 15 to 20 percent and covering topics like implementing access control, authentication, and secure data solutions. The fifth one is monitoring, troubleshooting, and optimizing Azure solutions, holding a weightage of 15 to 20 percent in the exam; instrumenting solutions to support monitoring and logging, integrating caching and content delivery with solutions, and developing code to support scalability of apps and services are covered in this module.
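For the instrumentation piece, one common approach is sending your application logs to Application Insights; here is a minimal sketch assuming the opencensus-ext-azure Python package and a hypothetical instrumentation key:

```python
import logging
from opencensus.ext.azure.log_exporter import AzureLogHandler

# Route standard Python logging records to Application Insights.
logger = logging.getLogger("order-service")
logger.setLevel(logging.INFO)
logger.addHandler(
    AzureLogHandler(connection_string="InstrumentationKey=<your-key-here>")
)

logger.info("order received")
logger.warning("payment retry exhausted for order %s", "12345")
```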
The last one is connect to and extend Azure services and third-party services, which holds a weightage of 20 to 25 percent in the exam, covering topics like establishing API gateways, integrating Azure Search within solutions, and developing an App Service Logic App. Now let's move forward and discuss the exam pattern. No doubt, the AZ-203 exam is one of the most challenging exams in the IT industry. Although the number of questions in this exam is subject to change over time, you can expect around 40 to 60 questions, and you can also expect different question formats and types in the exam, including review screen, mark review, multiple choice, short answer, hot area, repeated answer choices, drag and drop, case studies, best answer, and active screen; all the questions in this exam can follow any of these question types. You will get around 150 minutes to complete the examination, with an additional 30 minutes of seating time. For good results in the exam, it is important to follow the weightage associated with each exam module during your AZ-203 exam preparation. The pricing depends on which location you are taking your exam from; for example, if you are taking your exam in the USA, then it is going to cost you around 165 dollars, but if you are taking it in India, then you will have to pay an amount of 4,800 Indian rupees, and the pricing is subject to change without notice from country to country. As the pricing does not include applicable taxes, you need to confirm with your examination provider for an exact fee. If you are a student, then you can get a fee reduction in the exam if you can submit your valid education credentials. Microsoft trainers and Academy program members are also eligible for reduced pricing. Most of the Microsoft technical exams require a passing score of 700, and AZ-203 is not an exception: any score greater than or equal to 700 will be marked as pass, otherwise it will be marked as fail. Most of the short questions in this exam are worth one point; if any question is worth more than that, then it will be indicated in the exam itself. Also note that there is no penalty for an incorrect answer, and the exam is available only in the English language. As we now know the examination pattern, let's discuss a few tips for the preparation of the exam. How to prepare for the AZ-203 exam: the first step is to plan the module structure and study accordingly.
For example, if you are going to start with App Services and containers, then make sure you have already covered the topics of creating App Services, containers, container images, and Dockerfiles, so that you know the basics of implementing App Services in a container. You would require a lot of AZ-203 practice questions and verified study materials to earn a good passing score; you can refer to the official curriculum and study guide to plan your module study and get the right information. After that, practice sample tests online. Once you have gained enough knowledge, then you should be able to step up into the next part, practicing hands-on, and this is the most crucial part in your learning journey. You can easily perform hands-on on Azure services since Azure provides a free tier option for newbies and you get $300 of credit in your Azure account once you sign up, which is valid for almost a year. Practicing hands-on will not only help you in cracking the exam, but it will also get you ready for your future job roles. In addition to that, check for online trainings so that in case you have queries related to the subject, you can clear them right away, and expert help is always recommended. Last but not least, join the forums related to Microsoft Azure, search for questions related to your subject, and check out the questions asked by other people and go through the answers posted by the audience itself; it will always be helpful for you. The last addition from my end is, go for dumps only when you are thorough with the subject. You might be able to clear the exam and get the certification from it, but believe me, you won't be job ready, as during the interview, without the practical approach, you won't be able to crack it or be able to answer the questions asked by the interviewer, because you took the shortcut to get certified. So beware of the shortcuts, do a thorough study with proper implementations in Azure, and rise up in your career path. Now you might be wondering how to take time for all this research and study; that is why we at Intellipaat are here for you. We understand that you as professionals face problems taking time out for skilling up from your personal life; that is why we have done all the hard work for you and have come up with a comprehensive course on the AZ-203 certification exam. If you are interested, then you can go through the course details with the link provided in the description box.
Before AZ-300 and AZ-301 were introduced, 70-535 was the examination conducted to certify someone as an Azure architect. Let's see the reason behind this change. We all definitely have a good number of applications on our phones, and every time there's a new update for an application, you'll receive a pop-up notification asking you to update it. The same way, Azure has new features added to it every year, and every time there is an update, the examinations also have to be updated, hence the transition. The scope of the 70-535 exam was to focus exclusively on design and architectural elements and eliminate practical implementation details, but AZ-300 and AZ-301 focus on Azure architect technologies and Azure architect design respectively. Now that we know why this transition happened, let's see what is taught in AZ-300. In AZ-300 and 301, the test consists of 139 questions. AZ-300 follows the following objectives: the deploy and configure infrastructure module makes up 34% of the questions in the examination; the implement workloads and security module has 23% of the questions; create and deploy apps has 10% of the questions in the examination; and for implement authentication and secure data, 8 percent of the questions are based on this domain.
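Since implement authentication and secure data is one of those domains where hands-on practice pays off, here is a minimal sketch of reading and writing a secret in Azure Key Vault with the azure-keyvault-secrets and azure-identity Python packages; the vault URL and secret names below are hypothetical placeholders:

```python
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

# The vault URL below is a hypothetical placeholder; DefaultAzureCredential
# picks up your az login, environment, or managed-identity credentials.
client = SecretClient(
    vault_url="https://my-study-vault.vault.azure.net",
    credential=DefaultAzureCredential(),
)

# Store a connection string as a secret, then read it back.
client.set_secret("sql-connection-string", "Server=tcp:example;Database=demo;")
secret = client.get_secret("sql-connection-string")
print(secret.name, secret.value)
```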
Develop for the cloud has 25% of the questions based on this domain in the examination. Next, let's see what is taught in AZ-301 and what percentage of questions is asked from each module. AZ-301 follows the following objectives: determining workload requirements, which has 10 to 15% of the questions based on this domain; designing for identity and security has 20 to 25%; designing a data platform solution has 15 to 20%; designing a business continuity strategy has 15 to 20%; designing for deployment, migration, and integration has 10 to 15 percent of the questions; and lastly, designing an infrastructure strategy has 15 to 20 percent of the questions based on this domain in the examination. Next, let's see why you should learn or opt for AZ-300 and 301. To become an Azure Solutions Architect Expert, you need to get certified by Microsoft. What happens once you get certified? Your chances of getting hired increase by twenty-five to thirty percent, and in addition to that, you also get paid more compared to peers who are not certified. Next, let's see what the prerequisites required to learn AZ-300 and AZ-301 are. For admin, you will need basic networking knowledge, basic OS knowledge, knowledge of PowerShell, a basic understanding of Active Directory concepts including domains, users, and domain controllers, and a basic understanding of database concepts including tables and simple queries. For development, you will need experience with Azure programming in at least one Azure-supported language, which is C#, C# script, JavaScript, Java, or Python. In addition to that, a minimum of one year of experience in developing scalable solutions through the phases of software development is also required to learn AZ-300 and AZ-301 at a faster and better rate. Now let's move on to the basics required for the examination.
The AZ-300 and AZ-301 exams are among the most challenging exams in the IT industry. Although the number of questions in the AZ-300 and 301 exams is subject to change over time, you can expect around 40 to 60 questions if you are taking this exam now. You can expect different question formats and types in this exam, including review screen, mark review, multiple choice, short answer, hot area, repeated answer choices, drag and drop, case studies, build list, best answer, and active screen. Just like in the case of the number of questions, it is not necessary that you will get all these types of questions in the exam, but all the questions in the exam will follow one of these question types. You will get around 115 minutes to complete the examination, with an additional 30 minutes of seating time. For good results in the exam, it is important to follow the weightage associated with each exam module during your AZ-300 exam preparation. The pricing depends on which location you are taking your exam from; for example, if you are taking the exam in the US, then you will have to pay about 165 dollars, but if you are taking it in India, then you will have to pay 4,800 rupees. The pricing is subject to change without notice from country to country, and as the pricing doesn't include applicable taxes, you need to confirm with your examination provider for an exact fee. If you are a student, you are eligible for a fee reduction in the exam if you can submit your valid educational credentials. Microsoft Partner Network program members, Microsoft trainers, and Academy program members are also eligible for reduced pricing. Most of the Microsoft technical exams require a passing score of 700, and AZ-300 and 301 are not an exception: any score greater than or equal to 700 will be marked as pass, otherwise it will be marked as fail. Most of the short-answer questions in this exam are worth 1 point; if any question is worth more than that, it will be indicated in the question itself. Also note that there is no penalty for incorrect answers, and the exam is available only in the English language. As we now know what the examination pattern is, let's see a few tips to prepare for the exam.
How to prepare for AZ-300 and AZ-301: the first step to take when constructing your study plan is to plan accordingly and know your modules. You will need a lot of AZ-300 and AZ-301 practice questions and verified study materials to earn a good passing score in the AZ-300 and 301 examinations; refer to the official study guide to get the right information. Once you are done with that, practice sample tests online. With this knowledge, I think it is safe to step up your game by practicing hands-on; this is the most important step in your learning journey. Are you wondering how you would perform hands-on for the Azure services? Let me tell you how: Azure provides a free tier option for newbies, and you get a $300 credit in your Azure account once you sign up, which is valid for almost a year. This will not only help you in cracking the exam, but it will also get you ready for your future job role. In addition to that, check for trainings online so that in case you have any queries related to the subject, you can clear them right away, and expert help is always recommended. Last but not the least, join forums related to Microsoft Azure, look up questions pertaining to your subject, check out the questions asked by other people, and also go through the answers; it may always come in handy. At last, a suggestion: if you search, there are numerous shortcuts to getting yourself certified in any particular exam, like going through exam dumps or previously conducted Azure exams. Although that might help you crack the exam, it will take away the experience you get when you learn everything in depth, and believe me, I have been to and conducted numerous interviews for Azure profiles; people who get certified that way are not job ready. Why? Because they took the shortcut to get themselves certified. So beware of all these shortcuts, do a thorough study with proper implementations in Azure, and rise up in your career ladder.
Now you might be wondering how to take time out for all this research and study; that is where we at Intellipaat come in. We at Intellipaat understand that you as professionals face problems taking time out for skilling up from your personal life; that's why we have done all the hard work for you and have come up with a comprehensive course on the AZ-300 and AZ-301 certification exams. If you are interested, you can go through the course details with the link provided in the description box. Okay guys, we've come to the end of this session. I hope this session on Microsoft Azure was useful for you, and if you have any doubts, feel free to comment below and we'll be glad to help you out. Thank you.